0b6d8acee9062417390643a948224e91
https://www.smithsonianmag.com/history/extraordinary-discoveries-1507630/
Extraordinary Discoveries
Extraordinary Discoveries It took Chip Brown a couple of days to come to grips with El Mirador, the overgrown Maya city in the Guatemalan jungle that dwarfs the better-known Tikal. “Much is still buried,” he explains. “You have to stare at the topography awhile before you can quite let go of the idea that the contours and the hills and the little dales are not natural but reflect the buried remnants of a ruined city. You just have to overcome this blind spot about how utterly this has all been obliterated.” Although El Mirador was discovered some 85 years ago, most of the 15-square-mile site, abandoned nearly 2,000 years ago, has yet to be excavated. “Once you tune into the geography and the topography of the place,” adds Brown, whose cover story, “Lost City of the Maya,” begins on page 36, “then it’s really one of the most extraordinary places in the world.” And one of the most precarious. “I hope people will recognize what’s priceless about the place and not let it slip away,” says Brown, a much sought-after magazine journalist and the author of two books. “It’s very easy to lose what’s there and impossible to replace it.” Before he became executive editor of this magazine, Terence Monmaney wrote with distinction about science and medicine for Newsweek, the New Yorker and the Los Angeles Times, from which we plucked him nearly a decade ago. His story “The Triumph of Dr. Druker” documents a remarkable breakthrough in the treatment of chronic myeloid leukemia (CML), a deadly cancer. It also profiles the man most responsible for the therapy, Brian Druker. “When I started off, I didn’t know anything about him except he had made this great contribution,” says Monmaney. “The closer I got, the more interesting he became. There is something about his mind that insists on a kind of clarity and straightforwardness, which is part of the reason for his success.” Monmaney recognizes the need to strike a balance between realism about cancer and the prospects for treatments. “At the same time,” he says, “there is a tremendous amount of optimism, much of it because of Druker’s achievement. It’s worth focusing on that, not to get hopes up too high but to call attention to real progress that is the result of people working really hard to come up with new approaches.” CML is an unusual kind of cancer, and for that reason skeptics say the lessons learned in treating it may not be applicable to other kinds. Says Monmaney: “Druker will tell you, ‘Yeah, it’s true. Other cancers are more complicated. But it’s just a matter of time before we figure those out, too.’” Carey Winfrey was Smithsonian magazine's editor in chief for ten years, from 2001 to 2011.
56c4d31403a08ca9ce39f279d5b96033
https://www.smithsonianmag.com/history/facing-a-bumpy-history-144497373/
Facing a Bumpy History
Facing a Bumpy History In London in 1873, Mark Twain saw an advertisement for the services of a fellow American who had hung out a shingle on Fleet Street. At once inspired and skeptical, Twain made his way to the offices of Lorenzo N. Fowler, "practical phrenologist." "I found Fowler on duty," Twain wrote, "amidst the impressive symbols of his trade. On brackets, on tables . . . all about the room, stood marble-white busts, hairless, every inch of the skull occupied by a shallow bump, and every bump labeled with its imposing name, in black letters." During the 19th century, thousands of busts like those Twain described were manufactured and sold by Fowler and others. One of them — its surfaces inked with lines showing the location of such traits as "Conjugality" and "Combativeness" — is on display at the American History Museum's "Science in American Life" exhibit, surrounded by other measures of human intellect and personality. According to the "science" of phrenology, an individual's character and abilities could be deduced from the size and shape of various bumps on the head. By the time Twain visited Fowler, phrenology had developed an enormous following, especially in America. Characteristics such as verbal memory, "Amativeness" and "Secretiveness" were supposed to be controlled by corresponding areas, or "organs," of the brain. The more developed the trait, the larger the organ, and the larger a protrusion it formed in the skull. Phrenologists also believed that such traits — and their respective organs — could be modified through the practice of restraint or by the conscious "exercise" of a positive quality. In the 20th century, phrenological busts have become comic conversation pieces, their images often used to patronize the past. Phrenology's failings are indeed obvious, but in our modern dismissal of it, its tremendous impact on 19th century society can easily be forgotten. And despite its shaky scientific foundations, phrenology is enjoying a measure of respect from those who study the brain today. Like another theory of mind that later permeated American culture, phrenology was the brainchild of a Viennese physician fascinated by the human psyche. Even as a schoolboy in the late 1700s, Franz Joseph Gall noticed that classmates who could memorize long passages with ease all seemed to have prominent eyes and large foreheads. From this he inferred that an organ of verbal memory must lie behind the eyes. He speculated that if one ability was "indicated by an external feature," others might be also. His expanded theory brought Gall renown, but also the disapproval of church authorities, who considered such ideas heretical. In 1802, the state prohibited him from promoting his theory in Austria. Not surprisingly, this only increased public interest. Gall began lecturing throughout Europe and in 1805, with his protégé and former student, Johann Kaspar Spurzheim, he left Austria for good. In the early years of the 19th century, Gall's ideas spread across Europe. But it was in America, a country starved for a "scientific" insight into the human mind (and one that offered the hope of individual perfectibility — read "self-help"), that phrenology would find its most devoted and enduring audience. And it was Spurzheim, having further expanded Gall's theory and adopted the name "phrenology," who would bring it to our shores. Spurzheim arrived in 1832 for a whirlwind lecture tour — one that literally killed him after just six months. 
But in that short time, he converted thousands, lecturing at Harvard and Yale, and across the American heartland. Ralph Waldo Emerson described him as one of the world's greatest minds. After Spurzheim's death, John James Audubon sketched his remains for posterity; Harvard president Josiah Quincy handled his funeral arrangements. "The prophet is gone," the American Journal of Medical Sciences declared, "but his mantle is upon us." The mantle fell, in large part, to a ministry student named Orson Fowler, who suddenly found his true calling in Spurzheim's theory and polemical practice. Fowler began to lecture on the topic to his classmates at Amherst College in Massachusetts, and to offer "readings" for 2 cents apiece. In one friend, the future Rev. Henry Ward Beecher, Fowler reported finding evidence of a "strong social brain" with "very large Benevolence." Orson's enthusiasm infected his younger brother, Lorenzo, along with the rest of the family. The two Fowler brothers — frustrated evangelists both — began touring the country, carrying "the truth of phrenology" from town to town, lecturing and offering readings, analyzing the character and propensities of utter strangers from the bumps and valleys on their skulls. (In one of his early sessions, Lorenzo Fowler studied the head of a shy 15-year-old named Clara Barton. Years later, in her memoirs, the founder of the American Red Cross recalled Fowler's comments: "She will never assert herself for herself — she will suffer wrong first — but for others she will be fearless.") America quickly became cranium-conscious. Employers advertised for workers with particular phrenological profiles — even asking for a reading by the Fowlers as a reference. Women began changing their hairstyles to show off their more flattering phrenological features. Everyone, from small-town folk to the rich and famous, sat for readings, including such notables as Horace Greeley and Brigham Young. (Predictably, P.T. Barnum scored high in all traits but "Cautiousness.") By the 1840s, the Fowlers' New York office, known as the Phrenological Cabinet, had become one of the most visited attractions in town, serving as a bizarre museum that included phrenological portraits of hundreds of famous people's heads. (At least one of them was specially commissioned, post-mortem. After the 1836 death of Aaron Burr, the Fowlers ordered a cast of the deceased's head, and found, upon examination, that Burr's organs of "Secretiveness" and "Destructiveness" were — not surprisingly — far larger than those of the average person.) As publishers, the Fowlers churned out the American Phrenological Journal and Miscellany (which remained in print until 1911), along with countless volumes on phrenology and its applications to health and happiness, including guides to phrenological parenting and the proper choice of a mate. They also printed the first volume by a young writer named Walt Whitman. When Emerson, after reading a manuscript of Leaves of Grass, famously wrote to its author, "I greet you at the beginning of a great career," the letter was addressed in care of the Fowlers. In the book itself, the Fowlers' influence is clear: "Who are you indeed who would talk or sing of America?" Whitman wrote. "Have you . . . learn'd the . . . phrenology . . . of the land?" So pleased was Whitman with his own phrenological reading ("large hope and comparison . . . and causality") that he would quote it time and time again in his writings. 
Edgar Allan Poe also regularly wove phrenological concepts into his work, even employing cranial descriptions in an 1850 series of sketches of New York literary figures. (Of William Cullen Bryant, he wrote, the "forehead is broad, with prominent organs of Ideality.") Charlotte Brontë's work is also laced with phrenological analyses. Herman Melville's Moby Dick even offers a lengthy (albeit mocking) phrenological description of the great whale. Because phrenological theory espoused the idea of perfectibility, social reformers quickly latched onto it. Horace Mann regarded phrenology as the greatest discovery of the age. The Fowlers themselves became vocal advocates of reform and self-improvement, sometimes through advice on the proper phrenological choice of a career, but also with regard to education, temperance, even prison reform. Of course, there were always skeptics--not least of them, Mark Twain, who recounted with horror that Fowler had found on his skull "a cavity" where humor ought to be. John Quincy Adams is said to have wondered how two phrenologists could look each other in the eye without laughing. But phrenology sailed on, pretty much unscathed, and until the turn of the century, continued to have an enormous impact on the public's ideas about the mind. So pervasive was it that as late as 1888, the editors of the Encyclopaedia Britannica, wanting to debunk it in the name of reason (not to mention common sense), felt compelled to publish a detailed, seven-page refutation of it. Gall's "so called organs," the Britannica declared, "were for the most part identified on slender grounds . . . made by an induction from very limited data." In some cases, the exponents of phrenology "have discovered coincidences of a surprising nature." But more often than not, such coincidences did not occur, and, the Britannica complained, when they did not, the phrenologists were apt to simply rationalize away the inconsistencies. By the 20th century, phrenology had lost any shred of scientific authority, except among a few diehards. But the Britannica had included in its lengthy attack a perceptive prediction: "Based, like many other artificial philosophies, on an admixture of assumption and truth, certain parts will survive and become incorporated into scientific psychology, while the rest will in due course come to be relegated to the limbo of effete heresies." And so it proved. Though phrenology fell into deserved disrepute, modern scientists note that in some ways it was remarkably prescient. As early as 1929, in his History of Experimental Psychology, Harvard psychologist Edwin G. Boring wrote that "it is almost correct to say that scientific psychology was born of phrenology, out of wedlock with science." It had, after all, an understanding that physiological characteristics of the brain influence behavior and — conversely — that behavior can alter our very physiology. (Of course, today scientists look at changes in neurochemistry and synaptic connections rather than "brain organs," but the principle is the same.) Phrenologists also reckoned that the mind is not unitary but composed of independent faculties. Their ideas — in other guises — have since given birth to the field of cognitive psychology, which breaks down mental functions (such as reading) into separate faculties (letter recognition, sentence comprehension and so forth). Perhaps most interesting is the idea that different mental functions are localized in the brain. 
One of the first scientists to provide evidence of this localization of function was a contemporary of the Fowlers. In 1861, Paul Broca, a French surgeon and anthropologist, showed that damage to a particular region of the brain — only about four square centimeters in size — can make a person unable to speak coherently, without affecting his or her comprehension of others' speech. "The phrenologists were definitely on the right track in that regard," says Marcus Raichle, a neuroscientist at Washington University in St. Louis. "The problem is where they took it." According to Antonio Damasio, a neuroscientist at the University of Iowa College of Medicine, the phrenologists were, in many ways, "quite astounding" for their time. "However, they did not understand that even the areas we have identified — quite different from their 'organs' — are interdependent parts of larger 'brain systems.'" Damasio, who studies the effects of lesions in the brain, believes he has located an area in the prefrontal cortex that is part of a system crucial to controlling inappropriate behavior and considering the emotional repercussions of one's actions. One of the most dramatic cases he has studied provides a suggestive link between 19th-century phrenology and modern neuroscience. It involves a New England railroad worker named Phineas Gage who, in 1848, suffered an amazing accident: an iron bar, more than an inch in diameter, was thrust by an explosion through his brain, entering his head under his cheekbone and exiting at the top of his skull. That he lived was astounding; even more remarkable, his reasoning and language were left entirely intact. What changed, however, was his temperament. Previously a responsible, gentle man, Gage was now argumentative, irresponsible and prone to cursing so vilely that women were warned not to remain in his presence. Using Gage's actual skull as a guide, Damasio and his wife, Hanna, a fellow neuroscientist, recently created a 3-D computer image of Gage's injury. The bar's trajectory, they found, had damaged the same region of the brain as had been injured in patients of theirs who exhibited similar behavior. Back in 1848, the diagnosis was only somewhat different. Along with all the doctors and journalists who came to observe him, Gage was visited by Nelson Sizer, a phrenology expert and associate of the Fowlers. The meeting provides further evidence that faulty logic can sometimes lead to correct conclusions. After comparing Gage's exit wound with his phrenological charts, Sizer determined — and accurately, no doubt — that Gage's change in demeanor, his violence and rudeness, were due not to damage in the prefrontal cortex but to an injury "in the neighborhood of Benevolence and the front part of Veneration."
89f2e99f9c8b0bfa8fb94cfe678610b2
https://www.smithsonianmag.com/history/factory-oreos-built-180969121/
The Factory That Oreos Built
The Factory That Oreos Built If walls could speak, the brick at New York’s Chelsea Market would have more than a few stories to tell. Alphabet (the parent company of Google) purchased the building in March of 2018 for $2.4 billion—an earth-shattering figure even in New York City’s real estate market—but this isn’t a glittering, 21st-century beacon, a symbol of the ingenuity of Silicon Valley. In reality, the looming brick structure remains largely the same as it did more than a century ago, when it served as headquarters for the iconic snack company Nabisco. Traces of the building’s storied past are still visible throughout the modern-day food hall and tourist hub. Faded murals depict “Oreo Sandwich” and the iconic Uneeda Biscuit boy in his emblematic yellow slicker and rain hat holding a tin of biscuits—an ode both to Nabisco’s innovations in packaging (Uneeda was the first prepackaged biscuit, thanks to patented In-Er-Seal technology) and advertising (it signaled the first multi-million dollar ad campaign). “Although New York has a richer history than any other American city, it does very little to preserve or memorialize its past,” says John Baick, professor of history at Western New England University, where he teaches a course on New York City history. “But New York does not simply bulldoze history, at least not when something can be repurposed. And the new Google building represents another stage in the city’s history, as industrial was replaced by the service industry, which will be replaced by the tech industry.” The building got its start in 1890 after a number of local bakeries merged to create the New York Biscuit Company and constructed an array of six-story Romanesque-style bakeries. Designed by Romeyn & Stever, they were built along Tenth Avenue between 15th and 16th Streets in the city’s Chelsea neighborhood, named after the estate that stood on that land in colonial times. In 1898 the company amalgamated once again, this time with its Chicago-based competitor, the American Biscuit and Manufacturing Company. They called their new venture National Biscuit Company, which “supporters called Nabisco and opponents labeled the ‘Cracker Trust,’” according to historian Mike Wallace in Greater Gotham: A History of New York City from 1898 to 1919. Over the course of the next year, Nabisco—led by the fastidious co-founder and future company president Adolphus W. Green—worked tirelessly to introduce a new product that would set their freshly created company on the path to success. That product? Uneeda Biscuits. Green—a workaholic to the extreme—was something of a prescient businessman and understood the importance of freshness, consistency, branding and advertising long before they were the norm, and the marketing of Uneeda Biscuits reflected his approach. To go along with their new production goals, Nabisco staff architect Albert G. Zimmerman designed additional baking facilities adjacent to the original New York Biscuit Company bakeries, and soon added four fireproof structures—two of which were solely devoted to baking Uneeda Biscuits, while another was for Nabisco Sugar Wafers. The new complex opened with great fanfare. “When the Uneeda Biscuit plant had been completed in New York City in May 1899, National Biscuit Company employees had proudly paraded through the streets, boasting of the opening of the biggest bakery ever,” wrote William Cahn in Out of the Cracker Barrel: The Nabisco Story from Animal Crackers to Zuzus. 
“A platoon of mounted policemen cleared the way for the procession, headed by the 23rd Regiment band and followed by no less than 112 gaily decked horse-drawn bakery wagons, each bearing the words ‘Uneeda Biscuit.’ There were floats, too, one representing the famous Ferris wheel with huge Uneeda Biscuit boxes for cars. Another carried an immense parrot, nine feet high, holding in one of its talons a proportionately large Uneeda Biscuit.” In 1906 Nabisco moved its corporate headquarters from Chicago to New York City—the country’s financial center—and as demand for its products grew, so did their facilities. They continued to expand by adding onto the bakery complex until it took up a full city block, as well as building new structures and buying up nearby ones—not dissimilar from Google’s practice today. Green’s penchant for innovation—and micro-management—spilled out into building design. Rather than stick with the typical “mill building” architectural style, Cahn notes that “he had no patience with such outworn patterns; NBC’s new bakeries were to pioneer in certain construction innovations. He would hound his engineers for new ideas that would create a neater and more orderly appearance.” When its network of Chelsea neighborhood bakeries was momentarily finished in 1913—the same year the Ford Motor Company began using moving assembly lines in its car production—Nabisco laid claim to the world’s largest bakery. “With 114 bakeries and a capital of $55 million, the corporation transformed cookie and cracker manufacturing…”, writes Wallace. But, like the company itself, the New York facilities had to constantly shift to meet the needs of the marketplace. In the 1930s, Nabisco altered the buildings to accommodate the freight railway that now went right through the building, which had the benefit of allowing direct access to train deliveries (“it was probably the only factory at the time constructed to permit a New York Central Railroad train to actually run through the plant to pick up and deliver freight,” points out Cahn). It was in these bakeries where Oreos—the now ubiquitous cream-filled chocolate sandwich cookies—were first invented and produced in 1912. A stretch of Ninth Avenue was even designated “OREO Way” in 2002 to honor what could easily be described as a momentous occasion in culinary history (popular from the get-go, it is still the second-best selling cookie in the United States today). In a short piece from the March 14, 1931 issue of The New Yorker, author E. B. White, of Charlotte’s Web fame, describes his visit to the headquarters and the democratic, casual process by which anyone could submit suggestions for new products, about half of which the company would actually test. “A baker makes up a trial batch of the new model and sends them upstairs, where they are placed in an open rack by the water cooler,” wrote White. “Employees may help themselves. Everything is informal—there are no charts or tables: after a few days have elapsed the heads of departments simply meet and talk the thing over…As soon as a cookie has passed its tests, it gets a name.” By 1958, Nabisco—like many city residents at the time—left its urban headquarters for the less expensive, more expansive suburbs in Fair Lawn, New Jersey, where they could have the space needed for expanded production. As Andrew Berman, executive director of the Greenwich Village Society for Historic Preservation points out, it was a time when many businesses and people were leaving the area. 
“Part of what made that area so desirable at the time for industry was that it was connected to rail and piers, so it was a great place for shipping and receiving goods and materials.” But increasingly those deliveries were made by truck instead, which was not as well-suited for the dense urban environment. The ensuing decades were a period of change for that part of Chelsea and the adjacent meatpacking district. “While the meatpacking industry of the district held on for a bit longer, it was slipping into a decline, and the area became known mainly as the home to the raunchiest nightlife in Manhattan,” writes Michael Phillips in the introduction to The Chelsea Market Cookbook. “A late-night trip to the meatpacking district could show some of New York City’s seediest, most violent, or most disreputable scenes, from men in blood-spattered jackets carting meat carcasses, to sex workers plying their trade, often playing out right next to each other.” Jim Casper, professor of sociology at the CUNY Graduate Center and head of the 300 West 15th Street Block Association, which abuts the former Nabisco complex, moved to the neighborhood in 1992 and recalls that “at that time, [the buildings] mostly had sweatshops in it… It was a wondrous thing when Chelsea Market opened in 1997, almost the same time as Chelsea Piers. The neighborhood suddenly attracted tourists.” Indeed, when developer Irwin Cohen paid about $10 million for the foreclosed mortgage debt on the building in 1990, the thought of bringing tourists to that part of town was just a pipe dream. “When I came here, the history of the building: there were three murders in the basement,” Cohen recalled in a 2005 interview with the Center for an Urban Future. “You couldn’t walk here. It was controlled by prostitutes 24 hours a day.” Chelsea Market opened in 1997 with many of the same anchor stores that remain today, like Amy’s Bread, Ronnybrook Dairy, and The Lobster Place. Though today food halls are all the rage—commercial real estate firm Cushman & Wakefield found that in the first nine months of 2016 alone the number of food halls in the United States increased by 31.1 percent—Chelsea Market was ahead of its time. “When it first opened people thought it was a crazy idea to take this hulking old building at the north of the Meatpacking District—not a chic area by any means at the time—and try to turn it into this trendy food hall,” says Berman. “A lot of people scoffed at the idea. And they were wrong. It was tremendously successful—one of many motors for really transforming that area into the destination it is now.” Today Chelsea Market, now an indoor market and food hall frequented by tourists and locals alike, attracts some six million visitors a year. It ushered in an era of transformation and gentrification to the neighborhood as other developers rode on Cohen’s coattails, scrambling to bring in high end stores, restaurants, hotels, and attractions like the adjacent High Line—a 1.45 mile stretch of abandoned elevated train tracks that has been turned into the city’s most visited destination. A report by the NYU Furman Center found that rents in Chelsea more than doubled between 1990 and 2014. Much like its early days, the 1.2 million-square-foot property is still home to a number of bakeries, but Chelsea Market’s roster of tenants also includes restaurants, shops, and offices like Food Network, Major League Baseball, and—as of 2007—Google. The company—which has well over a dozen offices in the U.S. 
alone—purchased the building across Ninth Avenue in 2010 after being tenants there for four years, and has been leasing more and more space in Chelsea Market as it became available. So it’s no big surprise that they’d take the next step in the relationship. Alphabet/Google claims that little will change with the purchase, as the previous owner, Jamestown Properties, will retain Chelsea Market branding rights and will continue to manage the food hall. “This purchase further solidifies our commitment to New York, and we believe the Manhattan Chelsea Market will continue to be a great home for us and a vital part of the neighborhood and community,” writes David Radcliffe, VP of Real Estate and Workplace Services, in a company blog post. He promises “little or no impact to the community and tenants of the building.” One of the main questions up in the air is whether Alphabet will build on top of the existing 11-story structure. After years of battle between locals and Jamestown Properties, plans to upzone Chelsea Market (meaning to allow for further development) were passed in 2012, but have yet to be acted upon. Berman and the Greenwich Village Society for Historic Preservation were among some of the most outspoken critics of the upzoning. “I am concerned that as time goes on the elements of Chelsea market that were supposed to be preserved and protected—including predominantly independently owned food purveyors—is not going to stay that way,” says Berman. “Not because I have any particular suspicions of Google, but as the area continues to change it might be more useful to them. Apparently this is all about their ever expanding need for space, so it would surprise me a great deal if they did not.” Casper, who also fought the upzoning, is slightly more optimistic. “Google generally has tried to be a good neighbor,” he says. “It will be interesting if they do the expansion or not.” Manissa Maharawal, an assistant professor of anthropology at American University and native New Yorker, has a more pessimistic view. “Based on my research and the work of the Anti-Eviction Mapping Project on evictions, housing and gentrification in the Bay Area, Google and the tech industry as a whole has not been a ‘good neighbor,’” she says. “In fact as the [San Francisco] Google bus protests pointed out, their presence in the region has contributed to the housing and affordability crisis, something they have not taken responsibility for or worked with housing groups to mitigate.” Whatever the expansion brings, it will be just another step in a long line of innovation, transformation, and gentrification at this part of New York City.
166fd99237aa7a9367942f501558aa40
https://www.smithsonianmag.com/history/fall-rise-fall-pompeii-180955732/
The Fall and Rise and Fall of Pompeii
The Fall and Rise and Fall of Pompeii On a sweltering summer afternoon, Antonio Irlando leads me down the Via dell’Abbondanza, the main thoroughfare in first-century Pompeii. The architect and conservation activist gingerly makes his way over huge, uneven paving stones that once bore the weight of horse-drawn chariots. We pass stone houses richly decorated with interior mosaics and frescoes, and a two-millennia-old snack bar, or Thermopolium, where workmen long ago stopped for lunchtime pick-me-ups of cheese and honey. Abruptly, we reach an orange-mesh barricade. “Vietato L’Ingresso,” the sign says—entry forbidden. It marks the end of the road for visitors to this storied corner of ancient Rome. Just down the street lies what Turin’s newspaper La Stampa called Italy’s “shame”: the shattered remains of the Schola Armaturarum Juventus Pompeiani, a Roman gladiators’ headquarters with magnificent paintings depicting a series of Winged Victories—goddesses carrying weapons and shields. Five years ago, following several days of heavy rains, the 2,000-year-old structure collapsed into rubble, generating international headlines and embarrassing the government of then-Prime Minister Silvio Berlusconi. The catastrophe renewed concern about one of the world’s greatest vestiges of antiquity. “I almost had a heart attack,” the site’s archaeological director, Grete Stefani, later confided to me. Since then this entire section of Pompeii has been closed to the public, while a committee appointed by a local judge investigates the cause of the collapse. “It makes me angry to see this,” Irlando, a genial 59-year-old with a mop of graying hair, tells me, peering over the barrier for a better look. Irlando enters the nearby Basilica, ancient Pompeii’s law court and a center of commerce, its lower-level colonnade fairly intact. Irlando points out a stone lintel balanced on a pair of slender Corinthian columns: Black blotches stain the lintel’s underside. “It’s a sign that water has entered into it, and it’s created mold,” he tells me with disgust. A few hundred yards away, at the southern edge of the ruins, we peer past the cordoned-off entrance to another neglected villa, in Latin a domus. The walls sag, the frescoes are fading into a dull blur, and a jungle of chest-high grass and weeds chokes the garden. “This one looks like a war zone,” says Irlando. Since 1748, when a team of Royal Engineers dispatched by the King of Naples began the first systematic excavation of the ruins, archaeologists, scholars and ordinary tourists have crowded Pompeii’s cobblestone streets for glimpses of quotidian Roman life cut off in medias res, when the eruption of Mount Vesuvius suffocated and crushed thousands of unlucky souls. From the amphitheater where gladiators engaged in lethal combat, to the brothel decorated with frescoes of couples in erotic poses, Pompeii offers unparalleled glimpses of a distant time. “Many disasters have befallen the world, but few have brought posterity so much joy,” Goethe wrote after touring Pompeii in the 1780s. And Pompeii continues to amaze with fresh revelations. A team of archaeologists recently studied the latrines and drains of several houses in the city in an effort to investigate the dietary habits of the Roman empire. Middle- and lower-class residents, they found, had a simple yet healthy diet that included lentils, fish and olives. 
The wealthy favored fattier fare, such as suckling pig, and dined on delicacies including sea urchins and, apparently, a giraffe—although DNA evidence is currently being tested. “What makes Pompeii special,” says Michael MacKinnon of the University of Winnipeg, one of the researchers, “is that its archaeological wealth encourages us to reanimate this city.” But the Pompeii experience has lately become less transporting. Pompeii has suffered devastating losses since the Schola Armaturarum collapsed in 2010. Every year since then has witnessed additional damage. As recently as February, portions of a garden wall at the villa known as the Casa di Severus gave way after heavy rains. Many other dwellings are disasters in the making, propped up by wooden struts or steel supports. Closed-off roads have been colonized by moss and grass, shrubs sprout from cracks in marble pedestals, stray dogs snarl at passing visitors. A 2011 Unesco report about the problems cited everything from “inappropriate restoration methods and a general lack of qualified staff” to an inefficient drainage system that “gradually degrades both the structural condition of the buildings as well as their decor.” Pompeii has also been plagued by mismanagement and corruption. The grounds are littered with ungainly construction projects that squandered millions of euros but were never completed or used. In 2012, Irlando discovered that an emergency fund set up by the Italian government in 2008 to shore up ancient buildings was instead spent on inflated construction contracts, lights, dressing rooms, a sound system and a stage at Pompeii’s ancient theater. Rather than creating a state-of-the-art concert venue, as officials claimed, the work actually harmed the historic integrity of the site. Irlando’s investigation led to government charges of “abuse of office” against Marcello Fiori, a special commissioner given carte-blanche power by Berlusconi to administer the funds. Fiori is accused of having misspent €8 million ($9 million) on the amphitheater project. In March, Italian authorities seized nearly €6 million ($7 million) in assets from Fiori. He has denied the accusations. Caccavo, the Salerno-based construction firm that obtained the emergency-fund contracts, allegedly overcharged the state on everything from gasoline to fire-prevention materials. Its director was placed under house arrest. Pompeii’s director of restoration, Luigi D’Amora, was arrested. Eight individuals are facing prosecution for charges including misallocation of public funding in connection with the scandal. “This was a truffa, a scam,” says Irlando, pointing out a trailer behind the stage where the police have stored theatrical equipment as evidence of corruption. “It was all completely useless.” Administrative malpractice is not unheard of in Italy, of course. But because of the historic importance and popular appeal of Pompeii, the negligence and decay in evidence there are beyond the pale. “In Italy, we have the greatest collection of treasures in the world, but we don’t know how to manage them,” says Claudio D’Alessio, the former mayor of the modern city of Pompei, founded in 1891 and located a few miles from the ruins. 
A recent editorial in Milan’s Corriere della Sera declared that Pompeii’s disastrous state was “the symbol of all the sloppiness and inefficiencies of a country that has lost its good sense and has not managed to recover it.” For its part, Unesco issued an ultimatum in June 2013: If preservation and restoration efforts “fail to deliver substantial progress in the next two years,” the organization declared, Pompeii could be placed on the List of World Heritage in Danger, a designation recently applied to besieged ancient treasures such as Aleppo and the Old City of Damascus in Syria. ********** Pompeii’s troubles have come to light at the very moment that its twin city in first-century tragedy—Herculaneum—is being celebrated for an amazing turnaround. As recently as 2002, archaeologists meeting in Rome said Herculaneum was the “worst example of archaeological conservation in a non-war torn country.” But since then, a private-public partnership, the Herculaneum Conservation Project, established by the American philanthropist David W. Packard, has taken charge of the ancient Roman resort town by the Bay of Naples and restored a semblance of its former grandeur. In 2012, Unesco’s director general praised Herculaneum as a model “whose best practices surely can be replicated in other similar vast archaeological areas across the world” (not to mention down the road at Pompeii). Herculaneum’s progress made news just a few months ago, when researchers at the National Research Council in Naples announced a solution to one of archaeology’s greatest challenges: reading the texts of papyrus scrolls cooked at Herculaneum by the fiery pyroclastic flow. Scientists had employed every imaginable tactic to unlock the secrets of the scrolls—prying them apart with unrolling machines, soaking them in chemicals—but the writing, inscribed in carbon-based ink and indistinguishable from the carbonized papyrus fibers, remained unreadable. And unspooling the papyrus caused further damage to the fragile material. The researchers, headed by physicist Vito Mocella, applied a state-of-the-art method, X-ray phase-contrast tomography, to examine the writing without harming the papyrus. At the European Synchrotron Radiation Facility in Grenoble, France, high-energy beams bombarded the scrolls and, by distinguishing contrasts between the slightly raised inked letters and the surface of the papyrus, enabled scientists to identify words, written in Greek. It marked the beginnings of an effort that Mocella calls “a revolution for papyrologists.” ********** It was on the afternoon of August 24, A.D. 79, that people living around long-dormant Mount Vesuvius watched in awe as flames shot suddenly from the 4,000-foot volcano, followed by a huge black cloud. “It rose to a great height on a sort of trunk and then split off into branches, I imagine because it was thrust upwards by the first blast and then left unsupported as the pressure subsided,” wrote Pliny the Younger, who, in a letter to his friend, the historian Tacitus, recorded the events he witnessed from Misenum on the northern arm of the Bay of Naples, about 19 miles west of Vesuvius. “Sometimes it looked white, sometimes blotched and dirty, according to the amount of soil and ashes it carried with it.” Volcanologists estimate that the eruptive column was expelled from the cone with such force that it rose as high as 20 miles. Soon a rain of soft pumice, or lapilli, and ash began falling over the countryside. 
That evening, Pliny observed, “on Mount Vesuvius broad sheets of fire and leaping flames blazed at several points, their bright glare emphasized by the darkness of night.” Many people fled as soon as they saw the eruption. But the lapilli gathered deadly force, the weight collapsing roofs and crushing stragglers as they sought protection beneath staircases and under beds. Others choked to death on thickening ash and noxious clouds of sulfurous gas. In Herculaneum, a coastal resort town about one-third Pompeii’s size, located on the western flank of Vesuvius, those who elected to stay behind met a different fate. Shortly after midnight on August 25, the eruption column collapsed, and a turbulent, superheated flood of hot gases and molten rock—a pyroclastic surge—rolled down the slopes of Vesuvius, instantly killing everyone in its path. Pliny the Younger observed the suffocating ash that had engulfed Pompeii as it swept across the bay toward Misenum on the morning of August 25. “The cloud sank down to earth and covered the sea; it had already blotted out Capri and hidden the promontory of Misenum from sight. Then my mother implored, entreated and commanded me to escape as best I could....I refused to save myself without her and grasping her hand forced her to quicken her pace....I looked round; a dense black cloud was coming up behind us, spreading over the earth like a flood.” Mother and son joined a crowd of wailing, shrieking and shouting refugees who fled from the city. “At last the darkness thinned and dispersed into smoke or cloud; then there was genuine daylight....We returned to Misenum...and spent an anxious night alternating between hope and fear.” Mother and son both survived. But the area around Vesuvius was now a wasteland, and Herculaneum and Pompeii lay entombed beneath a congealing layer of volcanic material. ********** The two towns remained largely undisturbed, lost to history, through the rise of Byzantium, the Middle Ages and the Renaissance. In 1738, Maria Amalia Christine, a nobleman’s daughter from Saxony, wed Charles of Bourbon, the King of Naples, and became entranced by classical sculptures displayed in the garden of the royal palace in Naples. A French prince digging in the vicinity of his villa on Mount Vesuvius had discovered the antiquities nearly 30 years earlier, but had never conducted a systematic excavation. So Charles dispatched teams of laborers and engineers equipped with tools and blasting powder to the site of the original dig to hunt more treasures for his queen. For months, they tunneled through 60 feet of rock-hard lava, unearthing painted columns, sculptures of Roman figures draped in togas, the bronze torso of a horse—and a flight of stairs. Not far from the staircase they came to an inscription, “Theatrum Herculanense.” They had uncovered a Roman-era town, Herculaneum. Digging began in Pompeii ten years later. Workers burrowed far more easily through the softer deposits of pumice and ash, unearthing streets, villas, frescoes, mosaics and the remains of the dead. “Stretched out full-length on the floor was a skeleton,” C.W. 
Ceram writes in Gods, Graves and Scholars: The Story of Archaeology, a definitive account of the excavations, “with gold and silver coins that had rolled out of bony hands still seeking, it seemed, to clutch them fast.” In the 1860s a pioneering Italian archaeologist at Pompeii, Giuseppe Fiorelli, poured liquid plaster into the cavities in the solidified ash created by the decomposing flesh, creating perfect casts of Pompeii’s victims at the moment of their deaths—down to the folds in their togas, the straps of their sandals, their agonized facial expressions. Early visitors on the Grand Tour, like today’s tourists, were thrilled by these morbid tableaux. “How dreadful are the thoughts which such a sight suggests,” mused the English writer Hester Lynch Piozzi, who visited Pompeii in the 1780s. “How horrible the certainty that such a scene might be all acted over again tomorrow; and that, who today are spectators, may become spectacles to travelers of a succeeding century.” ********** Herculaneum remained accessible only by tunnels through the lava until 1927, when teams supervised by Amedeo Maiuri, one of Italy’s pre-eminent archaeologists, managed to expose about a third of the buried city, around 15 acres, and restore as faithfully as possible the original Roman constructions. The major excavations ended in 1958, a few years before Maiuri’s retirement in 1961. I’m standing on a platform suspended above Herculaneum’s ancient beachfront, staring down at a grisly scene. Inside stone archways that framed the entrance to a series of boat houses, 300 skeletons huddle, frozen for eternity in positions they had assumed at the moment of their deaths. Some sit propped against stones, others lie flat on their backs. Children nestle between adults; a few loners sit by themselves. “They didn’t know what was going to happen to them. Maybe they were all waiting for rescue,” says Giuseppe Farella, a conservator. Instead, they were overcome by a 1,000-degree Fahrenheit avalanche of gas, mud and lava, which burned the flesh off their bones, then buried them. “It must have been very painful, but very fast,” says Farella. The exhibit, which opened in 2013, is among the latest initiatives of the Herculaneum Conservation Project, supported by the Packard Humanities Institute in Los Altos, California (founded by David W. Packard, an heir to the Hewlett-Packard fortune), in partnership with the British School at Rome, and the Superintendency for the Archaeological Heritage of Naples and Pompeii, the government body that administers the site. Since the project’s founding in 2001, it has spent €25 million ($28.5 million) on initiatives that have revitalized these once-collapsing ruins. The project began to take shape one evening in 2000, when Packard (who declined to be interviewed for this article) considered ideas for a new philanthropic endeavor with his friend and renowned classics scholar Andrew Wallace-Hadrill, then director of the British School at Rome. Wallace-Hadrill recommended Herculaneum. “The superintendent showed [Packard] around the site; two-thirds was closed to the public because it was falling down,” Sarah Court, the project’s press director, tells me in a trailer beside the ruins. “Mosaics were crumbling, frescoes were falling off walls. Roofs were collapsing. It was a disaster.” Herculaneum, of course, faced the same cronyism and financial shortages that Pompeii has today. But Packard staffers took advantage of private money to hire new specialists. 
One of the site’s biggest problems, lead architect Paola Pesaresi tells me as we walk the grounds, was water. The ancient city sits some 60 feet below the modern city of Herculaneum, and rain and groundwater tend to collect in pools, weakening foundations and destroying mosaics and frescoes. “We had to find a delicate way to prevent all this water from coming in,” she says. The project hired engineers to resurrect the Roman-era sewage system—tunnels burrowed three to six feet beneath the ancient city—two-thirds of which had already been exposed by Maiuri. They also installed temporary networks of aboveground and underground drainpipes. Pesaresi ushers me through a tunnel chiseled through the lava at the entrance to the ruins. Our conversation is nearly drowned out by a torrent of water being pumped from beneath Herculaneum into the Bay of Naples. We stroll down the Decumanus Maximus, a street where public access has long been quite limited, because of the danger of falling stones and collapsing roofs. After millions of dollars of work, the facades are secure and the houses are dry; the street fully opened in 2011. Workers have painstakingly restored several two-story stone houses, piecing together original lintels of carbonized wood—sealed for 2,000 years in their oxygen-less tomb—along with terra-cotta-and-wood roofs, richly frescoed walls, mosaic floors, beamed ceilings and soaring atriums. Pesaresi leads me into the Casa del Bel Cortile, a recently renovated, two-story home with an open skylight, a mosaic-tiled floor and a restored roof protecting delicate murals of winged deities posed against fluted columns. Unlike Pompeii, this villa, as well as numerous others in Herculaneum, conveys a sense of completeness. Art restorers are stripping away layers of paraffin that restorers applied between the 1930s and 1970s to prevent paint from cracking on the city’s magnificent interior frescoes. “The early restorers saw that the figurative scenes were flaking, and they asked themselves, ‘What can we do?’” Emily MacDonald-Korth, then of the Getty Conservation Institute, tells me during a lunch break inside a two-story villa on the Decumanus Maximus. The wax initially worked as a kind of glue, holding the images together, but ultimately speeded the frescoes’ disintegration. “The wax bonded with the paint, and when water trapped behind the walls sought a way of coming out, it pushed the paint off the walls,” she explains. For some years, the Getty Institute has experimented with laser techniques to restore frescoes, employing a noninvasive approach that strips away wax but leaves paint untouched. Now the Getty team has applied that technique at Herculaneum. “We’re doing this in a controlled way. It won’t burn a hole through the wall,” MacDonald-Korth says. In 1982, the site’s then director, Giuseppe Maggi, uncovered the volcanic sands of buried Herculaneum’s ancient seafront, as well as a 30-foot-long wooden boat, hurled ashore during the eruption by a seismic tremor-created tsunami. It was Maggi who uncovered the 300 victims of Vesuvius, along with their belongings, including amulets, torches and money. One skeleton, nicknamed “the Ring Lady,” was bedecked in gold bracelets and earrings; her rings were still on her fingers. A soldier wore a belt and a sword in its sheath, and carried a bag filled with chisels, hammers and two gold coins. Several victims were found carrying house keys, as if fully expecting to return home once the volcanic eruption had passed. 
Though excavation work began in the 1980s, forensics experts more recently photographed the skeletons, made fiberglass duplicates in a lab in Turin and, in 2011, placed them in the identical positions as the original remains. Walkways allow the public to view the reproduced skeletons. Today, with restoration virtually completed and new landscaping installed, tourists can walk along the sand just as residents of Herculaneum would have done. They can also relive to a remarkable degree the experience of Roman visitors who arrived by sea. “If you were here 2,000 years ago, you would approach by boat and pull up on a beach,” says conservator Farella, leading me along a ramp past the arches opening to the skeletons. In front of us, a steep set of stairs breaches the outer walls of Herculaneum and takes us into the heart of the Roman city. Farella leads me past a bath complex and gymnasium—“to smarten yourself up before you come into town”—and a sacred area where departing travelers sought protection before venturing back to sea. Farther along stands the Villa of the Papyri, believed to be the home of Julius Caesar’s father-in-law. (The villa housed the scrolls now being deciphered by researchers.) It is closed to the public, but plans are underway for a renovation, a project that Farella says “is the next great challenge” at Herculaneum. He leads me into the Suburban Baths, a series of interconnected chambers filled with huge marble tubs, carved stone benches, tiled floors, frescoes and friezes of Roman soldiers, and a furnace and pipe system that heated the water. Solidified lava, frozen for 2,000 years, pushes up against the doors and windows of the complex. “The bath building was filled with pyroclastic material; excavators chipped it all away,” the conservator says. We pass through the colonnaded entryway of a steam room, down steps leading into a perfectly preserved bathtub. Thick marble walls have sealed in moisture, replicating the atmosphere that Roman bathers experienced. Yet, as if to underscore the reality that even Herculaneum has its troubles, I’m told that parts of this ghostly former center of Roman social life have opened to the public only intermittently, and it is closed now: There’s simply not enough staff to guard it. ********** In Pompeii, another eight stops along the Circumvesuviana Line, the train that carries thousands of visitors to the site every day, past graffiti-covered stations and scruffy exurbs, the staff is eager to present an impression of new dynamism. In 2012 the European Union gave the go-ahead for its own version of a Herculaneum-style initiative: the Great Pompeii Project, a €105 million ($117.8 million) fund intended to rescue the site. Mattia Buondonno, Pompeii’s chief guide, a 40-year veteran who has escorted notables including Bill Clinton, Meryl Streep, Roman Polanski and Robert Harris (who was researching his best-selling thriller Pompeii), pushes through a tourist horde at the main entrance gate and leads me across the Forum, the marvelously preserved administrative and commercial center of the city. I wander through one of the most glorious of Pompeii’s villas, the House of the Golden Cupids, a wealthy man’s residence, its interior embellished with frescoes and mosaics, built around a garden faithfully reproduced on the basis of period paintings. Fully restored with funding from the Italian government and the EU, the house was to open the week after my visit, after being closed for several years. 
“We needed money from the EU, and we needed architects and engineers. We could not realize this by ourselves,” says Grete Stefani, Pompeii’s archaeological director. I also paid a visit to the Villa dei Misteri, which was undergoing an ambitious renovation. After decades of ill-conceived cleaning attempts—agents that were used included waxes and gasoline—the villa’s murals, depicting scenes from Roman mythology and everyday life in Pompeii, had darkened and become indecipherable. Project director Stefano Vanacore surveyed the work-in-progress. In an 8-by-8-foot chamber covered with frescoes, two contractors wearing hard hats were dabbing the paintings with outsize cotton swabs, dissolving wax. “This stuff has been building up for more than 50 years,” one of the workers told me. In a large salon next door, others were using laser tools to melt away wax and gasoline buildup. Golden sparks shot off the bearded face of the Roman god Bacchus as the grime dissolved; beside him, a newly revealed Pan played his flute, and gods and goddesses caroused and banqueted. “It’s beginning to look the way it did before the eruption,” Vanacore said. A wall panel across the room presented a study in contrasts: The untouched half was shrouded in dust, with bleached-out red pigments and smudged faces; the other half dazzled with figures swathed in fabrics of gold, green and orange, their faces exquisitely detailed, against a backdrop of white columns. I asked Vanacore how the frescoes had been allowed to deteriorate so markedly. “It’s a complicated question,” he said with an uneasy laugh, allowing that it came down to “missing the daily maintenance.” The Villa dei Misteri, which reopened in March, may be the most impressive evidence to date of a turnaround at Pompeii. A recent Unesco report noted that renovation work was progressing on 9 of the 13 houses identified as being at risk in 2013. The achievements of the Great Pompeii Project, along with the site’s routine maintenance program, so impressed Unesco that the organization declared that “there is no longer any question of placing the property on the World Heritage in Danger list.” Still, despite such triumphs, Pompeii’s recent history of graft, squandered funds and negligence has many observers questioning whether the EU-financed project can make a difference. Some Italian Parliamentarians and other critics contend that Pompeii’s ruins should be taken over in a public-private initiative, as at Herculaneum. Even the Unesco report sounded a cautious note, observing that “the excellent progress being made is the result of ad hoc arrangements and special funding. The underlying cause of decay and collapse...will remain after the end of the [Great Pompeii Project], as will the impacts of heavy visitation to the property.” ********** To Antonio Irlando, the architect who is Pompeii’s self-appointed watchdog, the only solution to saving Pompeii will be constant vigilance, something that the site’s managers and the Italian government have never been known for. “Italy was once leading the world in heritage conservation,” he says. Squandering Unesco’s good will would be, he declares, “a national shame.” Joshua Hammer is a contributing writer to Smithsonian magazine and the author of several books, including The Bad-Ass Librarians of Timbuktu: And Their Race to Save the World's Most Precious Manuscripts and The Falcon Thief: A True Tale of Adventure, Treachery, and the Hunt for the Perfect Bird.
df3cff3538d9b28fffa8ec4eff1a707b
https://www.smithsonianmag.com/history/fanciful-and-sublime-74131278/
Fanciful and Sublime
Fanciful and Sublime Quiet can be a blessing, but unnatural silence is something else again. In the storeroom of the National Museum of American History where we keep a portion of the Smithsonian’s vast musical instruments collection, the stillness goes against the grain. Though all the objects in the room were made for noise and use, they’ve been tamed by the discipline of a museum. Trumpets, oboes, flutes and harmonicas lie like specimens in drawers, as bugs and birds do in other great collections of the Institution. Violins, guitars, banjos and fat horns sit in cabinets. Cellos in their cases rest against the walls. Not a sound from the lot, and yet the mind can’t help but hear each one. There are perhaps ten exceptional collections of musical instruments in Europe and the United States, and the Smithsonian’s is among the very best of them. It comprises some 5,000 objects under the care of the Division of Cultural History in the American History Museum (not because the instruments are all American in character, but because the museum was originally a museum of history and technology) and a like number of instruments housed, because of their ethnographic character, in the National Museum of Natural History. The part of the collection we have the space to exhibit publicly at any one time can only hint at what’s behind the scenes. A display of keyboard instruments in a gallery of the American History Museum, for example, includes one of three surviving harpsichords by the 18th-century master Benoist Stehlin; a piano of the smallish sort for which Beethoven wrote his first two piano concertos; the immense Steinway grand from 1903 that was number 100,000 manufactured by the company; and a contemporary Yamaha acoustic and digital piano of aluminum and Plexiglas, with a control panel that might have come from a recording studio. Each is a marvel, and we could multiply them by another gallery or two. Some of the items in the Smithsonian’s collection are astonishingly beautiful (stringed instruments by the Italian master Antonio Stradivari); some are barely functional (an impossibly heavy banjo made from a World War I German artillery shell, with bullet casings for tuning pegs); and many are wayward and fanciful (a peanut-shaped harmonica with a Jimmy Carter smile). Of course, human whimsy can run headfirst into a wall of natural selection: there was to be no future for a piano fitted with bells, drums and a bassoon stop, or a violin with what appears to be a gramophone horn attached (to amplify and direct the sound). The greatest treasures of the collection are neither out of sight nor only for silent display. These are the stringed instruments by Stradivari (1644-1737), who could put the geometry of a barely discernible curve in wood to heavenly purpose. Stradivari never heard a string quartet—the format emerged after his death—which perhaps helps explain why he made hundreds of violins and so few violas (only 13 still exist) and cellos (63 exist). Of the estimated 1,100 instruments Stradivari made, only 11 survivors feature ornamentation, with black lacquer tracings and ivory inlays. Four of those—a quartet of such exquisite physical beauty that they qualify as sculpted art—are in our collection, the gift of retired publisher Dr. Herbert Axelrod. Dr. Axelrod’s generosity has also brought us a superb set of instruments by Stradivari’s teacher, Nicolò Amati. 
We’re now renovating a gallery in the American History Museum in which all these rare and beautiful objects (and other prized examples of the luthier’s art) will be on display in 2003. On display, that is, when they’re not at work. For the instruments are never shown to greater advantage, or kept in better health, than when they’re played. Former Smithsonian Secretary S. Dillon Ripley laid down the law about that: "Let the instruments sing!" On recordings and in the many chamber concerts sponsored by our music programs, the most spectacular of the instruments do just that. And when they sing, as they have for centuries, time is erased, differences are eased, and there is harmony across the ages. Lawrence M. Small was the eleventh secretary of the Smithsonian Institution, serving from 2000 to 2007.
8c08b1d37559b3de70007c40e1f244c0
https://www.smithsonianmag.com/history/father-reginald-foster-used-latin-bring-history-present-180976657/
Father Reginald Foster Used Latin to Bring History Into the Present
Father Reginald Foster Used Latin to Bring History Into the Present The death of Latin has been greatly exaggerated. Of course, Latin is no longer the default language for European learning and diplomacy, as it was from the Roman Empire through the early modern period. Since the implementation of Vatican II in the early 1960s, even many priests don't speak the language in a meaningful way. Still, despite Latin's decline in political and ecclesiastical circles, hundreds of folks around the globe continue to speak it as a living language—and no teacher is more responsible for the world's remaining crop of latineloquentes (“Latin speakers”) than Friar Reginald Foster, the Carmelite monk who served as Latin secretary to four popes from 1969 until 2009, translating diplomatic papers and papal encyclicals into Latin, which remains the official language of the Holy See. Foster died on Christmas Day, at the age of 81. In 2007, Foster himself lamented to the BBC that he thought the language was on its way out altogether. He worried that a modern world, illiterate in Latin, would lose contact with crucial portions of history, and half-jokingly recommended that then-Pope Benedict XVI replace Italy's traditional siesta with a two-hour daily Latin reading. The Pope never took up Foster's suggestion, but the irony is that Foster had already managed, almost single-handedly, to reverse some of the trends that so troubled him. His deepest passion was teaching Latin at the Pontifical Gregorian University in Rome, starting in 1977, and running his famous spoken Latin course nearly every summer, beginning in 1985. Through these courses, Foster launched multiple generations of classicists who have used his techniques to bring their students into closer contact with a past that, until recently, had seemed to be vanishing. Foster is well remembered for his boisterous, generous presence in the classroom and on field trips. He was beloved among students, and distrusted by Vatican grandees, for his eccentric habits, which included dressing in a blue plumber's suit and issuing caustic statements about church hypocrisy. When he was teaching—in Rome until 2009, thereafter in Wisconsin—he often nursed a glass of wine. Known by the Latin sobriquet "Reginaldus" to his legions of pupils, who in turn refer to themselves as "Reginaldians," Foster was a hero and a jester, a pug-nosed provocateur with a satirical streak who would have fit right into a comic epistle by Horace or Erasmus. "Like Socrates, his default mode in public was ironic," says Michael Fontaine, an administrator and professor of Classics at Cornell University. Fontaine, who first met Foster in the spring of 1997, makes no bones about the extent of Foster's legacy. "Reginald Foster succeeded in reversing the decline in living Latin. He actually, really, genuinely did it. Reggie's success is total: There is a burgeoning movement and critical mass of young people who have now learned Latin [as a spoken language]. Reggie taught some, his students taught some, those people are teaching some, and on and on. Some of the best Latinists in the world are in their 20s or early 30s"—a remarkable development that Fontaine credits squarely to Foster’s peerless influence. Leah Whittington, an English professor at Harvard University, who first met Foster during a summer Latin course in 1997 when she was 17, recalls the friar's "phenomenal, ebullient energy." "He never sat down, never seemed to need rest or eat or sleep," Whittington says. 
"It was as though he was fueled from within by love for Latin, love for his work, love for his students. I had never been pushed so hard by a teacher." Like all of Foster's students who spoke with Smithsonian, Whittington recalls his visionary dedication to preserving Latin by keeping it alive in everyday conversation. "For most classicists trained in the United States or in Great Britain, Latin was a learned, non-spoken language; it was not a language that one could converse in, like French or Spanish. But for Reginald, Latin was an everyday functional language that he used with his friends, his teachers, his colleagues, with himself and even in his dreams." Foster went to extraordinary lengths to make sure he was keeping his students as engaged as possible with their work outside the classroom, which the friar referred to not as homework but as ludi domestici—"games to play at home." This playful approach often proved a revelation to students used to more staid ways of teaching a language they'd been told was dead. "It's so rare to have an immersion experience in Latin that it couldn't fail to improve and deepen your knowledge of the language and history,” says Scott Ettinger, a Latin and Greek teacher in the Bronx, who attended Foster's summer course in 1996. Daniel Gallagher, who in 2009 succeeded Foster in the Latin section of the Vatican Secretariat and today teaches the language at Cornell University, still marvels at Foster's "extreme dedication to his students." "He told us, 'Call me at 2 in the morning if you're stuck,'" says Gallagher, who began studying with Foster in October 1995. "He said, 'I'll even come to your house to teach you Latin.' And I learned that he wasn't kidding—he really would come to my house." Classicist Jason Pedicone recalls his first course with Foster in 2004: "He made me feel like learning Latin was a key that would unlock endless beauty and wisdom of history, art and literature." "Studying Greek and Latin with Reginald was spiritually enriching,” he says. “I don't mean that in a doctrinal way; it was just really life-affirming and made me stand in awe of humanity and civilization." In 2010, Pedicone co-founded the Paideia Institute with Eric Hewett, another of Foster's students; the organization offers immersive courses in Latin and Greek. Tales of Foster have long been common among anglophone classicists. Even those who never visited him in Rome had often heard something about this eccentric priest who gave free, immersive Latin lessons. "I had heard for some time that there was a priest in Rome who spoke Latin and gave free summer courses where you actually spoke Latin," says Alice Rubinstein, a now-retired Latin teacher living in Virginia. "I remember some woman telling me he was like a priestly version of Don Rickles.” "[Foster] reminds me of the humanists I study in the 15th century, especially Lorenzo Valla," says classicist Chris Celenza, a dean at Johns Hopkins University who took courses with Foster in 1993 and marvels at the friar's unerring ability to bring the past into the present, to make old texts new. "Foster could almost ventriloquize the authors we were studying. He was a living anachronism, and I think he knew it and kind of delighted in that." 
In his obituary for Foster, John Byron Kuhner, who is writing a biography of the friar, sounded a similar note about Reginaldus' uncanny ability to make ancient writers seem intimate and accessible—a closeness that he fostered in his students: "The writers and artists of the past seemed to be equally [Foster's] friends. He loved them in a way we could see, the way we love our living friends who happen to be far away." Foster's famous summer Latin course was full of day trips. Traditional jaunts included the site in Formia where Cicero was assassinated by Mark Antony's men in 43 B.C. ("Reginald would weep while reciting Cicero's epitaph," Whittington recalls); the gardens at Castel Gandolfo, the Pope's summer residence, where students sang Latin songs to "papal bulls"—that is, cows grazing outside the Pope's house; to the port town of Ostia; Pompeii and Naples; the spot at Largo Argentina in Rome where Julius Caesar was assassinated; the castle in Latium where Thomas Aquinas was born. "Walking with Reggie through these Italian sites made Rome come alive in a way that it couldn't have without someone of his encyclopedic knowledge of Latin," says Alexander Stille, a journalism professor at Columbia University, who profiled Foster for the American Scholar in 1994. "Foster used to tell us that 'Reading Augustine in translation is like listening to Mozart on a jukebox,'" Stille says, "and that being in Rome without access to Latin was to see an impoverished version of it. He made the city come alive." There are many classicists (I am one of them) who never met Foster but who benefited from his teachings by studying under his protégés, many of whom use techniques pioneered by Foster. "When I led student trips to Italy, I modeled them on the field trips Foster used to take with us," says Helen Schultz, now a Latin teacher at a private school in New Hampshire. "On one memorable occasion, he joined me and a group of my students to talk about their studies and his work at the Vatican. He didn't just love Latin; he also loved and cared deeply about every one of the students who learned from him and were inspired by him to do our best to keep his legacy alive." Like many of Foster's students, Ada Palmer, a European history professor at the University of Chicago, says the friar opened up a whole world of post-Classical Latin literature for his charges. Rather than falling back on the typical, and almost entirely ancient, canon taught in most classrooms, he introduced scholars to the Latin of St. Jerome's autobiography, or medieval bestiaries, or Renaissance books of magic, or rollicking pub songs from the 17th and 18th centuries, Palmer says, and thereby widened the possibilities for Latin studies across the world. "Reggie's enthusiasm was for all Latin equally," Palmer says, "and he encouraged us to explore the whole vast, tangled and beautiful garden of Latin, and not just the few showpiece roses at its center. He trained scholars who have revolutionized many fields of history and literary studies." Celenza agrees, referring to the millions of pages of Latin from the Renaissance onward as "a lost continent" that Foster played a central role in rediscovering. Foster was famous for many of his one-liners, perhaps none more so than his frequent reminder to students that "Every bum and prostitute in ancient Rome spoke Latin." 
(In one variant on this line, "dog-catcher" takes the place of "bum.") His point was that one needn't be an elite to appreciate the riches of a language that began, after all, as a vernacular. But Foster's interest in bums and prostitutes was not merely rhetorical. "He did a lot of good for the prostitutes of Rome," Ettinger says. Foster was known for giving what little money he had to the city's downtrodden, even though, by keeping his classes free, he ensured that he had practically no income. (He was also known sometimes to pay a student's rent in Rome for a semester.) "In one's life, if you're lucky, you'll meet a certain number of people who are genuinely extraordinary and who try to change your life in some way. Reggie was one of those people in my life," Stille says. "There were few people on the planet who have the relationship to Latin that he did." In his final weeks, Foster's friends say, he was as boisterous as ever, even after testing positive for Covid-19: He continued working with Daniel P. McCarthy—a Benedictine monk who began studying with Foster in the fall of 1999—on their book series codifying Foster's teaching methods. And he maintained lively conversations with protégés, often in Latin, via phone and video calls. Today, classicists, philologists and anyone else who wishes they had taken a Latin immersion course with Foster can console themselves with several options offered by his former students. Each summer, you will find Ettinger helping organize the annual Conventiculum aestivum ("summer convention") in Lexington, Kentucky, an 8- to 12-day immersive program that welcomes 40 to 80 attendees a year. Other Foster protégés, including Whittington, Gallagher, Fontaine and Palmer, have taught immersive classes through the Paideia Institute. Foster may be gone, but his dedication to Latin as a living language, one that puts us in direct conversation with our past, continues to thrive against all odds. Ted Scheinman is a senior editor for Smithsonian magazine. He is the author of Camp Austen: My Life as an Accidental Jane Austen Superfan
004c4e9ea3c065e4153162d38693f029
https://www.smithsonianmag.com/history/fifty-years-ago-north-korea-captured-american-ship-and-nearly-started-nuclear-war-180967919/
Fifty Years Ago, North Korea Captured an American Ship and Nearly Started a Nuclear War
Fifty Years Ago, North Korea Captured an American Ship and Nearly Started a Nuclear War From missiles flying over Japan to threats of fiery destruction for Guam, North Korea spent much of 2017 provoking its East Asian neighbors—and the United States. While the recent missile tests and threats seem alarming, it’s hardly the first time the U.S. and North Korea have danced on the edge of total war. The secretive nation, currently ruled by dictator Kim Jong-Un, has long engaged in belligerent and sometimes violent behavior to deter other nations from attacking, and to prove the government’s legitimacy to its people. Perhaps no incident is more demonstrative of the risks North Korea is willing to take than the capture of the USS Pueblo that occurred 50 years ago, on January 23, 1968, at the height of the Cold War. Since the end of the Korean War in 1953, when North Korea and South Korea signed an armistice agreement, the United States has played a direct role in protecting the southern, democratic nation. Throughout the 1950s, as North Korea began rebuilding itself (the U.S. dropped 635,000 tons of explosives on the north during the war, more than the entire amount used over all of the Pacific Theater in World War II), the United States did all that it could to ensure a stable, non-communist government remained in power in South Korea. The U.S. even went so far as to station atomic weapons in the south in 1958, violating the rules of the armistice agreement. The Vietnam War brought an increasing number of American troops to the region, making the United States and South Korea even closer allies. The growing American involvement in Vietnam led North Korean leaders to believe their country might be next, and the number of violent attacks on the South Korean and American militaries rose accordingly. While only 32 incidents occurred on the heavily fortified DMZ (the border between North and South Korea) in 1964, that number rose to around 500 by 1967. By 1968, the North Korean regime was ready to launch an even more audacious assault—an assassination attempt on South Korean president Park Chung-hee in the presidential mansion, known as the Blue House. A 31-man commando team infiltrated South Korea on January 21 of that year, but were detected before they came close to Park, and all but two men were killed in the ensuing firefight. Just two days later, North Korean torpedo boats and submarine chasers successfully surrounded and captured the USS Pueblo, a Navy intelligence ship patrolling international waters that had few weapons with which to defend itself. Of the ship’s 83 crew members, one was killed in the attack, while the rest were taken as prisoners. What could have motivated the small, relatively powerless country to commit such a blatant crime? “There is no evidence that the assassination attempt and the Pueblo incident were related, but the North’s objectives in seizing the ship may have been to detach the United States from the South by forcing the former to negotiate directly with the [North Korean government] to release the prisoners,” writes historian Steven Lee in The Journal of Korean Studies. In other words, maybe attacking the ship was meant to create a diplomatic divide between the U.S. and South Korea, requiring the United States to open up more direct dialogue with North Korea, which might anger the government in South Korea. 
There’s also the theory put forward by the CIA in 1969: that the capture of the Pueblo was a way to ensure the South Korean government couldn’t retaliate against North Korea for the Blue House assassination attempt without causing all-out war. “There are also some who have suggested that the North Korean failure to assassinate the president [in South Korea] was an embarrassment to the North Korean government, so they captured the ship as a way to distract people,” says Mitchell Lerner, professor of history at Ohio State University and the author of The Pueblo Incident: A Spy Ship and the Failure of American Foreign Policy. “I don’t actually think there’s much to that. They were very different operations.” In Lerner’s opinion, the main motivation for capturing the Pueblo was domestic propaganda. “This was a way to demonstrate their strength, their power, that they had forced the mighty United States to capitulate,” Lerner says. At the time of its capture, the North Koreans didn’t know the ship carried any surveillance technology or sensitive documents. From outside appearances, it seemed to be no more than an old cargo carrier—an easy target. Condemnation of the attack came rapidly from the United Nations and all of America’s allies—and, secretly, in the following years, from Communist powers like the Soviet Union and China. Though President Lyndon B. Johnson didn’t want to launch a second war in Asia, he seriously considered military retaliation. In a deployment operation named “Combat Fox,” Johnson sent B-52 bombers and aerial refueling craft to Okinawa and Guam, 200 F-4 fighter jets to the Korean peninsula, and three nuclear aircraft carriers to the sea between Japan and Korea, Lee writes. Meanwhile, Secretary of Defense Robert McNamara told Johnson, “The great danger that we must avoid is that the Soviets and the North Vietnamese will interpret something that we do as a sign of weakness.” But the cost of war would’ve been far too high, and “military action designed to extricate the [imprisoned] sailors would likely only see them killed,” Lee writes. Plus, there was the Vietnam War to consider. South Korea was one of America’s best allies in the war, contributing tens of thousands of soldiers to the fight. “In the wake of the Blue House raid and the Pueblo incident, they were starting to make noise that they might withdraw their forces [from Vietnam] and launch an attack on North Korea, and the United States certainly didn’t want that,” Lerner says. So the U.S. began providing even more domestic assistance to South Korea in return for their continued aid in Vietnam, while also negotiating with North Korea for the release of the 82 men from the Pueblo, who were regularly tortured throughout their detention and eventually forced to sign documents admitting to unlawful spying. Negotiations moved slowly and were largely unproductive. It wasn’t until December 23, 1968—a full 11 months since the Pueblo was captured—that chief negotiator Major General Gilbert Woodward signed a document apologizing for illegal spying and promising never to do so again, while at the same time verbally repudiating the apology. The 82 crewmen were then released, though the Pueblo stayed in North Korean custody, where it remains to this day. “In every crisis recorded in the archives, it was U.S. restraint that prevented war,” said Van Jackson, political scientist at Victoria University of Wellington, by email.
“We found out later—through conversations that North Korean officials had with Soviet counterparts—that North Korea was poised to retaliate against our retaliation had we chosen to do so.” Apart from the value of restraint and diplomacy, Lerner suggests policymakers take another lesson from the Pueblo incident: recognition of North Korea’s independence. “China is not the answer to the North Korea problem,” Lerner says. “In 1968, American policymakers never really accepted the idea that North Korea might have acted alone and for its own internal reasons. Instead they looked for larger conspiracies, whether it was the Soviets or the Chinese. For the last few decades, American politicians have talked about how the answer to North Korea is China, and China can bring them under control. The reality is the relationship between China and North Korea is not nearly as simple as American policymakers seem to think.” But perhaps most important is the fact that starting a war today would likely result in millions of deaths, and could turn into a global conflict. “The lesson we have to take above all else is patience,” Lerner says. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
e41fa66195ceefee7cee4bd33b776a0b
https://www.smithsonianmag.com/history/fifty-years-ago-satchel-paige-brought-negro-leagues-baseballs-hall-fame-180977000/
Fifty Years Ago, Satchel Paige Brought the Negro Leagues to Baseball’s Hall of Fame
Fifty Years Ago, Satchel Paige Brought the Negro Leagues to Baseball’s Hall of Fame Eyewitnesses said that Satchel Paige, one of the best pitchers baseball will ever see, would tell his teammates to sit on the field, so confident that he’d strike out the batter on his own. The right-handed ace’s showmanship was backed up by the remarkable athletic ability on display with his deadly accurate fastball. Over an estimated 2,600 innings pitched, Paige registered more than 200 wins and, impressively, more than 2,100 strikeouts. And those numbers are incomplete—many of his games, played in the Negro Leagues, went unrecorded. “Satchel was pitching in a way that, just based on his performance as a pitcher, he would’ve ranked as one of the all-time greats, if not the greatest,” says Larry Tye, author of the 2009 biography Satchel: The Life and Times of an American Legend. For 20 years after he more-or-less hung up his cleats, however, the National Baseball Hall of Fame, where baseball greats from Babe Ruth to Walter Johnson were enshrined, didn’t have room for Paige or any other Negro Leaguers. Because it was a different league, segregated from the majors solely by race, the Hall hadn’t even considered its players eligible for induction. But in 1971, the Cooperstown, New York, institution finally began to recognize the accomplishments of players whose case for greatness rested on their performance in the Negro Leagues, starting with Paige. A native of Mobile, Alabama, Leroy Paige was born in 1906 and grew up with 11 siblings. Given the nickname “Satchel” for a contraption he made for carrying passengers’ bags at a local train station, he found his talent for baseball at a correctional school. At 18, he joined the Mobile Tigers, a black semi-professional team. No stranger to barnstorming—the practice of teams traveling across the country to play exhibition matches—Paige debuted in the Negro Leagues in 1926 for the Chattanooga Black Lookouts. Among the teams he played for were the Birmingham Black Barons, the Baltimore Black Sox, the Pittsburgh Crawfords (surrounded by other legends, including Josh Gibson and Cool Papa Bell), and the Kansas City Monarchs. Paige won four Negro American League pennants with the Monarchs from 1940 to 1946. Paige was far from the only phenom in the Negro Leagues. Gibson was a monumental power hitter; Oscar Charleston played a gritty, all-around game; and Bell was known for his beyond-human speed, just to name a few. But when it came to star quality, Paige possibly surpassed them all. “He’s probably the biggest drawing card in the history of the Negro Leagues,” says Erik Strohl, vice president of exhibitions and collections at the Hall of Fame. Legend surrounds Paige with stories of his remarkable feats, and some of it was even self-produced: He kept track of his own statistics, and the numbers he would provide to others were astounding, if not sometimes inconsistent. While a lack of written accounts of many of his pitching performances has created issues of veracity, the confirmed information available still suggests that his accomplishments are befitting of his prestige.
“When you say that he is a legend and one of the greatest players of all time, it may seem like an exaggeration,” says Strohl, “and it's hard to quantify and qualify, but I think probably, undoubtedly that was true in terms of the length and swath of his career.” “He had great speed, but tremendous control,” says historian Donald Spivey, author of the 2013 book If You Were Only White: The Life of Leroy “Satchel” Paige. “That was the key to his success,” he adds, a control that paired with Paige’s ability to identify batters’ weaknesses from their batting stances. Spivey says that Paige’s prestige was a boon even for his opponents, as crowds would flock to the games where he was pitching. “The man was a tremendous drawing card,” he notes. He earned a reputation for jumping from one team to the next, depending on who offered the most money. “He got away with it because he was so reliable,” says Tye. “He gave you the ability to draw in fans.” Not unlike other talented Negro Leaguers of the era, Paige wanted an opportunity with the MLB. Midway through the 1948 season, he got his chance when he signed with the Cleveland Indians. He was certainly an atypical “rookie,” entering the league when he was 42 after more than 20 years of Negro League competition. (Jackie Robinson, for comparison, joined the Brooklyn Dodgers in 1947 when he was 28.) Paige managed to make his time count: he won six games amid a tense battle for the American League pennant, and Cleveland went on to take both the pennant and the World Series victory. Though his debut MLB season was successful, he spent just one more year with the Indians in 1949 before joining the St. Louis Browns in 1951. Following a three-year stint with St. Louis, Paige’s career in the MLB appeared over. However, he continued playing baseball in other leagues, and still found a way to make a brief one-game, three-inning appearance with the Kansas City Athletics in 1965 at the age of 59, not giving up a single run. Paige’s time in Major League Baseball was impressive for a player entering the league in their 40s, asserts Phil S. Dixon, author of multiple books about the Negro Leagues. “He also helped those teams because people wanted to see Satchel Paige,” Dixon says. “Not only was he a decent pitcher, he was an amazing draw.” The Negro Leagues were both the stage at which Paige dazzled audiences for years on end, and the mark of a barrier separating him and other black players from baseball’s biggest stage for years. That barrier would, for a time, be perpetuated by the Hall of Fame. Despite the impact that the Negro Leagues had on baseball and American culture, by the 1960s, just two players associated with them had been recognized as Hall of Famers. Robinson was the first black player inducted, in 1962, and seven years later his former teammate Roy Campanella joined him. The two had achieved entry off the merits of their MLB careers, however, whereas icons like Paige and Gibson had either few or no seasons outside the Negro Leagues. To those who played the game, their worthiness was not a matter of debate. On occasions when black squads faced off against their white contemporaries, they won at least as often as not, if not more. In 1934 Paige and star MLB pitcher Dizzy Dean had their barnstorming teams—one black, one white—face off against each other six times in exhibition play. Paige’s crew won four of those six meetings, including a tense 1-0 victory at Chicago’s Wrigley Field after 13 innings.
“Their role in the black community was one that said, ‘We can play as good as anybody,’” says Dixon. “‘And there's no reason for us not being in the major leagues, because not only can we play all of those guys, we can beat those guys.’” In the prime of Paige’s Negro League career, New York Yankees outfielder Joe DiMaggio once described Paige as the “best and fastest” pitcher he’d ever played against. Former Boston Red Sox star Ted Williams used part of his Hall of Fame speech in 1966 to mention the exclusion of Paige and other black players: “I hope that someday the names of Satchel Paige and Josh Gibson in some way can be added as a symbol of the great Negro players that are not here only because they were not given the chance,” Williams said to the crowd, a speech that Strohl notes occurred amid the civil rights movement. Meanwhile, sportswriters supportive of the cause used their platforms to argue for Negro Leaguers’ presence in the Hall. Members of the Baseball Writers’ Association of America, the body responsible for selecting Hall members, also created a committee in 1969 to advocate for Negro League inductions. MLB commissioner Bowie Kuhn, elected in 1969, publicly welcomed the idea of putting Negro League players in the Hall of Fame. In his 1987 autobiography Hardball: The Education of a Baseball Commissioner, Kuhn stated that he didn’t buy into the reasons against inducting Negro League players. “I found unpersuasive and unimpressive the argument that the Hall of Fame would be ‘watered down’ if men who had not played in the majors were admitted,” Kuhn wrote, looking back at the time. “Through no fault of their own,” he added, “the black players had been barred from the majors until 1947. Had they not been barred, there would have been great major-league players, and certainly Hall of Famers, among them.” With Kuhn’s help, the Hall formed its Negro Leagues committee in 1971, composed of several men, including Campanella and black sportswriters Sam Lacy and Wendell Smith. They were tasked with considering the merits of past players and executives for inclusion, and they announced in February that Paige was their inaugural nominee. Nevertheless, the Hall ran into controversy over how it planned to honor the Negro Leaguers: with a separate section, apart from the Major League inductees. Among the reasons cited was that some of the proposed inductees would not meet the minimum of ten MLB seasons required of other honorees. Instead of appearing like a tribute, the move was viewed by many as another form of segregation. “Technically, you'd have to say he's not in the Hall of Fame,” said Kuhn at the time, according to the New York Times. “But I've often said the Hall of Fame isn't a building but a state of mind. The important thing is how the public views Satchel Paige, and I know how I view him.” Backlash to the idea, from sportswriters and fans alike, was plentiful. Wells Twombly, writing for the Sporting News, declared, “Jim Crow still lives. … So they will be set aside in a separate wing. Just as they were when they played. It is an outright farce.” New York Post sports columnist Milton Gross rejected Kuhn’s rosy interpretation, writing, “The Hall of Fame is not a state of mind.
It is something semi-officially connected with organized baseball that is run by outdated rules which, as Jackie Robinson said the other day, ‘can be changed like laws are changed if they are unjust.’” With the backdrop of backlash and an upcoming election, the Hall changed their mind in July of that year. The pitcher himself stated that he was not worried where his tribute would be stored. “As far as I am concerned, I’m in the Hall of Fame,” he said. “I don’t know nothing about no Negro section. I’m proud to be in it. Wherever they put me is alright with me.” Tye argues that it was still a painful experience for Paige. “Satchel had dealt with so much affront that I think he took it with quite a bit of class when they offered to let him into the segregated Hall,” he says. “But it clearly was devastating to him.” A player whose name drew crowds and whose performances dazzled them, Paige was inducted into the National Baseball Hall of Fame in August 1971. A statue of Paige now adorns the Hall of Fame’s courtyard. It was installed in 2006, which is also the most recent year any Negro Leaguer has been inducted into the Hall. He is portrayed with his left leg up in the air. His right hand nestles the baseball. Eyes closed, Satchel is preparing a pitch for eternity. “I am the proudest man on the earth today, and my wife and sister and sister-in-law and my son all feel the same,” said Paige at the end of his Hall of Fame acceptance speech, reported the New York Times. “It's a wonderful day and one man who appreciates it is Leroy Satchel Paige.” Jacob Muñoz is an editorial intern for Smithsonian magazine.
80f8fcf4f12a3f11c2ce9a89880f5e9c
https://www.smithsonianmag.com/history/first-children-who-led-sad-lives-180958099/?no-ist
The First Children Who Led Sad Lives
The First Children Who Led Sad Lives In recent decades, most First Children have led charmed lives. Doted on by an adoring public, they have enjoyed opportunities that are rarely available to other Americans. Chelsea Clinton and Jenna Bush, for example, both parlayed their celebrity into cushy contracts with NBC News. Clinton told People magazine recently that she sees it as her duty to make sure that her daughter, Charlotte, “realizes how blessed she is—how blessed we [the members of our family] all are.” For the first century-and-a-half of the republic, however, the sons and daughters of presidents often struggled. Historian Michael Beschloss has referred to their collective misfortune as the “curse of the famous scion.” Several endured accidents or health crises that led to early death. As a group, they experienced much higher rates of alcoholism and mental illness than their peers. Destitution was not uncommon. In the 19th century, a few First Children did achieve success—Lincoln’s eldest son, Robert, eventually became the CEO of the Pullman Palace Car Company, and Webb Hayes, the second son of Rutherford B. Hayes, helped to found the corporate behemoth Union Carbide—but these cases were the exception rather than the rule. In stark contrast to Clinton and Bush, Abigail (“Nabby”) Adams, the eldest child of John Adams, lived in abject poverty for most of her adult life. She suffered through a difficult marriage to William Smith, a mentally unstable former aide-de-camp to George Washington. Smith would repeatedly abandon her and their four children for months—sometimes even years—at a time. In the late 1790s, when a few of Smith’s speculative ventures went belly up, Nabby lived with her husband in a tiny cottage on the grounds of a debtors’ prison. “My dear sister’s destiny might have been better,” Adams’s second son, Thomas, wrote of Nabby, who died of cancer at 48. Nabby’s brother, Charles, Adams’s third son, met an even crueler fate. Though he passed the bar in 1792, the Harvard grad never could make a decent living in his chosen profession. A chronic alcoholic, who was also a serial adulterer, Charles often lived apart from his wife and two daughters. Weighed down by worry over the distress of both Nabby and Charles, John Adams confessed to his wife, Abigail, a couple of years into his administration, “My children give me more pain than all my enemies.” In the fall of 1799, Adams disowned Charles, whom he never spoke to again. A year later, the destitute Charles died of cirrhosis of the liver at the age of 30. While John Quincy Adams, the first-born son of John Adams, was a stratospheric success—before being elected president in 1824, he served two terms as James Monroe’s secretary of state—his eldest son, George Washington Adams, committed suicide a month after the end of his presidency, drowning himself in the Long Island Sound while sailing from Providence to Washington. George, who had worked in Daniel Webster’s Boston law office for a few years, had recently fathered an out-of-wedlock child with a chambermaid. Due to his deep depression, he often spent his days locked in his tiny room where he “lived like a pig,” as one of his brothers put it.
After learning of his son’s death, the devastated former president vowed to God to “employ the remaining days which thou has allotted me on earth to purposes…tributary to the well-being of others.” A year later, John Quincy would stage a remarkable comeback as an abolitionist Congressman. Due to his recklessness, John Tyler, Jr., the third of President John Tyler’s eight children with his first wife, was a constant embarrassment to the family. A year after Vice President Tyler succeeded William Henry Harrison, the married John Jr. made a pass at Julia Gardiner, the Long Island beauty who would become his father’s second wife a couple of years later. Tyler ended up firing John Jr., who was then serving as his personal secretary. “The P. [President] says he really believes [John Jr.] part a madman,” Julia wrote. After the Civil War, John Jr. subsisted on a string of lowly patronage posts. “It were better,” concluded a journalist upon his death in 1896, “to be buried alive than to live a life so useless.” Born at the Army base in Fort Knox, Kentucky, in 1814, Sarah Taylor was nicknamed “Knox” by her father, Zachary Taylor, the career military man who was elected president in 1848. At eighteen, she fell in love with Jefferson Davis—then a recent West Point graduate stationed in Wisconsin. Her father opposed the union, saying, “I’ll be damned if another daughter of mine shall marry into the army. I know enough of the family life of officers. I scarcely know my own children, or they me.” Despite his objections, she married the future president of the Confederacy in 1835. Three months after the wedding, Knox, who had moved to Louisiana with her husband, died of malaria at the age of 21. In January of 1853, two months before his inauguration, Franklin Pierce, along with his wife Jane and his third and only surviving child, Benny, boarded a train in Andover, Massachusetts, which crashed soon after it left the station. The 11-year-old died instantly. “Gen. Pierce took him up,” the New York Times reported, “he did not think the little boy was dead until he took off his cap.” The Pierces were never the same. “How shall I be able to summon my manhood to gather up my energies for the duties before me, it is hard for me to see,” the devastated president-elect wrote to a friend that month. The First Lady hardly ever appeared in public and spent hours writing letters to her dead son. The loss of Benny affected the nation, as Pierce’s rudderless administration did little to stop America from heading toward a bloody internecine conflict. In May of 1874, 18-year-old Nellie Grant, the only daughter of President Ulysses S. Grant, married Englishman Algernon Sartoris in a lavish East Room ceremony. The president was reluctant to approve the union because this minor aristocrat would be taking her back to his native land. “I yielded consent,” Grant stated, “but with a wounded heart.” His fears were well grounded. As Henry James would later put it, Sartoris was a “drunken idiot of a husband,” who often abandoned Nellie and their three children by carrying on affairs with other women all over the globe. After Sartoris’s death a decade later, the miserable Nellie moved into her mother’s home in Washington. Soon after her second marriage in 1912, Nellie suffered a stroke, which left her paralyzed for the last seven years of her life.
But his four sons, all of whom served heroically in the armed forces, fared much less well. After fighting in both Mesopotamia against the Turks and in France against the Germans in World War I, TR’s second son, Kermit, ran the Roosevelt Steamship Company. A decade later, however, he succumbed to alcoholism and depression—afflictions for which his elder brother, Archie, had sent him to a mental hospital. Though Kermit was over 50 when World War II started, he was still eager to return to the battlefield. Fully aware of Kermit’s frail health, Army Chief of Staff George Marshall sent him to a post in Alaska where he was unlikely to do any fighting.  In June of 1943, Kermit shot himself in the head “due to despondence resulting from exclusion from combat duties.” Of his six children, Theodore Roosevelt felt closest to Quentin, his youngest, who was born in 1898. Of the avid reader and natural athlete, TR once remarked, “There is something very Theodore about all this.” Like his three older brothers, Quentin jumped at the chance to serve in World War I.  In the spring of 1917, after finishing his sophomore year at Harvard, Quentin headed to France. A year later, he saw action as a fighter pilot. On July 14, 1918, the Germans shot him down. The former president was crushed. “Since Quentin’s death,” TR said in the fall of 1918, “the world seems to have shut down upon me.” The heart-broken former president died a few months later. The eldest of Woodrow Wilson’s three daughters, Margaret Wilson had a delicate constitution.  “She has been a nervous child her whole life and is evidently unfitted by temperament to take a full college course,” her mother, Ellen Wilson, wrote to the Dean of Goucher College, which Margaret left after two years. After Wilson became president in 1913, Margaret took voice lessons to become a professional lieder singer. In 1918, after spending several months entertaining the troops in France, she suffered a nervous breakdown, which ended her performing career.  For most of the 1920s, Margaret, who never married or found another vocation, was a lost soul. In fact, in the last year of her father’s presidency, she was almost kicked off the Fifth Avenue bus because she did not have the dime fare. (A sympathetic driver, who had no idea who she was, decided to lend her the fare.) A decade later, she discovered Hindu philosophy and went to live in an ashram in South India where she died of uremia. Joshua Kendall is the author of First Dads: Parenting and Politics from George Washington to Barack Obama, which is coming out in May.
69c2610257763d1717f89e1ad2170075
https://www.smithsonianmag.com/history/first-moments-hitlers-final-solution-180961387/
The First Moments of Hitler’s Final Solution
The First Moments of Hitler’s Final Solution Before the start of World War II, around 9.5 million Jewish people lived in Europe. By the time the war ended, the Nazis had killed 6 million European Jews in concentration camps, or pogroms, or ghettos, or mass executions in what we refer to today as the Holocaust. The Nazis used the term Endlösung, or Final Solution, as the “answer” to the “Jewish question.” But when did this monstrous plan get put in motion? Adolf Hitler had provided clues to his ambition to commit mass genocide as early as 1922, telling journalist Josef Hell, “Once I really am in power, my first and foremost task will be the annihilation of the Jews.” But how he would enact such a plan wasn’t always clear. For a brief period, the Führer and other Nazi leaders toyed with the idea of mass deportation as a method of creating a Europe without Jews (Madagascar and the Arctic Circle were two suggested relocation sites). Deportation still would’ve resulted in thousands of deaths, though perhaps in less direct ways. When exactly Hitler settled on straightforward murder as a means of removal has been harder to pinpoint. As Yale historian Timothy Snyder writes, “It cannot be stressed enough that the Nazis did not know how to eradicate the Jews when they began the war against the Soviet Union [in the summer of 1941]… They could not be confident that SS men would shoot women and children in large numbers.” But as Operation Barbarossa, the name for the Nazi invasion of the U.S.S.R., proved during the mass shootings of June 1941 and the massacres at Kiev in September, the Order Police and Einsatzgruppen were more than willing to commit mass murder. This meant Hitler could take the solution to the Jewish problem to its “furthest extremes,” in the words of Philipp Bouhler, the senior Nazi official responsible for the euthanasia program that killed more than 70,000 handicapped German people. According to scholars Christian Gerlach and Peter Monteath, among others, the pivotal moment for Hitler’s decision came on December 12, 1941, at a secret meeting with some 50 Nazi officials, including Joseph Goebbels (Nazi minister of propaganda) and Hans Frank (governor of occupied Poland). Though no written documents of the meeting survive, Goebbels described the meeting in his journal on December 13, 1941: “With respect of the Jewish Question, the Führer has decided to make a clean sweep. He prophesied to the Jews that if they again brought about a world war, they would live to see their annihilation in it. That wasn’t just a catchword… If the German people have now again sacrificed 160,000 dead on the eastern front, then those responsible for this bloody conflict will have to pay with their lives.” In addition to Goebbels’s diary entry, historians cite the notes of German diplomat Otto Brautigam, who on December 18, 1941, wrote that “as for the Jewish question, oral discussions have taken place [and] have brought about clarification.” This meeting, which would be followed by the January 1942 Wannsee Conference (where the decision on exterminating all European Jews was further reinforced), was hardly the start of violence against Jews. Attacks had been happening in Nazi Germany’s occupied territories for years. What differentiated this period from earlier attacks was “an escalation of murder,” says Elizabeth White, historian at the United States Holocaust Memorial Museum.
“At some point I think, with the development of killing centers, [the Nazis] felt that they had the means and opportunity to realize the vision of a Jew-free Europe now rather than wait until after Germany had won [the war].” Australian historian Peter Monteath echoes that conclusion, writing in 1998 that the December 12 decision “made it clear that the principle of killing Jews in the occupied territories in the east was to be extended to all European Jews, including those in Germany and Western Europe.” In the decades that followed the Nuremberg Trials, in which Nazi officials, charged with crimes against peace and humanity, hid behind the excuse that they were just following orders, historians grappled with questions of blame and guilt. Had Hitler and top Nazi officials been solely responsible for the genocide? How complicit were lower-level Nazis and members of the Order Police? “We had big gaps in our knowledge because most of the documentation about how the genocide was carried out on the ground was captured by the Soviet Red Army and wasn’t available until after the Cold War,” says White. The fall of the Soviet Union led to a feast of wartime bureaucratic records, allowing historians to realize how much leeway Nazi officials were given. It became readily clear that the number of Nazis involved in enacting the Final Solution was much larger than previously believed. “The way Hitler worked was he would make these pronouncements, and people would go off and figure out, what did he mean? How are we going to do this?” says White. “You could work towards the Führer by being innovative and ruthless.” In other words, rather than giving explicit orders to each member of the Nazi party, Hitler made numerous statements vilifying Jewish people and declaring the need to exterminate them. After the December 12 meeting, these proclamations took a more precise tone: the Nazis needed to kill all Jews, including German Jews and Western European Jews, and they needed to do so systematically. What had started as uncertain and sporadic violence quickly turned into wholesale slaughter, complete with gas chambers and concentration camps. Six weeks later, SS chief Heinrich Himmler, the Nazi official responsible for the implementation of the Final Solution, ordered the first Jews of Europe to Auschwitz. The Holocaust had truly begun. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
1e2efa9b21af6ed79dbddb057f028146
https://www.smithsonianmag.com/history/first-novel-children-taught-girls-power-reading-180968765/
The First Novel for Children Taught Girls the Power of Reading
The First Novel for Children Taught Girls the Power of Reading “Consider with me, what is the true use of reading,” begins Sarah Fielding in the preface to her 1749 book The Governess. “If you can fix this truth in your minds, namely, that the true use of books is to make you wiser and better, you will have both profit and pleasure from what you read.” The readers Fielding addressed, and the characters in her book, were all girls. At a time when the literacy rate for women in England was around 40 percent, author Sarah Fielding wanted a different future for women. She not only wanted girls to read, but also to organize that knowledge in their minds to their own benefit. “For young women, reading, and reading novels in particular, was seen as a dangerous pastime,” says Candace Ward, a Florida State University professor of English who edited a recent edition of The Governess. “Sarah Fielding is suggesting there’s more to these works than simply fantastic romance.” It was a serendipitous moment for a woman to be entering the world of letters. Despite the lingering stigma attached to women in writing, Fielding was far from the first to dive into the fray. Feminist writer Mary Astell had been arguing for women’s right to education since the late 1600s, English novelist Eliza Haywood began publishing her work in the 1720s, and poet Martha Sansom wrote regularly about her desire not to be hemmed in by the domestic sphere around the same time. While little is known about the details of Fielding’s life, it’s clear that she was born in 1710 to a family of seven. Despite a turbulent family dynamic (her father eventually died in debtor’s prison), Fielding earned a meager education at a girls’ boarding school. She then built upon that knowledge through friends, tutors, and on her own initiative, learning to write literary criticism and read Greek and Latin. But Fielding never married, and had little inheritance on which to live. Instead of relying solely on the charity of friends, Fielding turned to writing as a means of supporting herself. “Women were very active in publishing and their prose narrative writing was as influential on the emergent novel form as writers like Daniel DeFoe or Samuel Richardson,” Ward says. In fact, Fielding regularly engaged with those male writers, writing criticism of Richardson’s work and earning his praise as the “much-esteemed Sally Fielding.” Fielding also worked with her brother, Henry, a novelist whose works include Tom Jones. Thanks to her brother’s encouragement, Sarah published her first novel in 1744, called The Adventures of David Simple. Although initially published anonymously, the book was critically acclaimed and so popular that it quickly went into a second edition. It also offered Fielding the opportunity to continue her pursuit of writing, now with her name attached to the works. Despite the acceptance of her contemporaries, Fielding’s position as a female author was still unusual. Women typically had far less education than men of the period, and so could rarely find work outside the home. They were instead expected to be “a sweet-tempered, dependable helpmate, responsible for maintaining the moral and spiritual values of the home, shunning decorative excess while remaining graceful, attractive, and nurturing,” writes literary scholar Arlene Fish Wilner. The other complication in Fielding’s work was the still-young form of the novel.
Until that point, fictional prose mostly came in the form of Romance, not in the modern sense of Fabio and his windswept woman, but in the medieval tradition of knights and fair maidens. Writers who wanted to try their hand at this new form followed some unwritten rules to distinguish their writing from earlier Romances: the stories were grounded in realism, used familiar rather than lofty language, and had protagonists that readers could identify with, Ward says. At the same time, authors of the new genre felt they had to prove the value of their work. “There was a perception that you had to justify writing the novel,” Ward says. That meant including a lesson for readers to imbibe. This could take the form of symbolic characters, who personified good or evil rather than being fully dimensional, or a literal moral tacked on to the end of the text. In the case of Fielding’s The Governess, the text very clearly offered examples of positive and negative traits. Composed of 20 narratives, including fables, fairy tales, and autobiographies from each of the girls in the school, The Governess, through its titular character Mrs. Teachum, demonstrates which behaviors are acceptable, and which faults the girls must strive to overcome (like vanity, selfishness and fear). All of these stories are exchanged among the girls and then discussed to unearth the lesson to be learned. “Mrs. Teachum’s pupils listen to each story, and the frame allows Mrs. Teachum (or sometimes her pupil Jenny Peace) to correct any potential misreadings,” writes Patrick Fleming, a professor of English. “With the exception of Jenny, none of the girls is entirely virtuous while at Mrs. Teachum’s school. Each has improved since before arriving, but none has reached moral maturity.” In other words, Fielding used her knowledge to encourage other girls to earn their own education, while also helping them cultivate the traits that would’ve made them acceptable wives—still perhaps the most important economic factor in a woman’s life. The underlying tension between those two ideas—independence versus reliance on a husband—is something Fielding struggled with throughout her career. “To be visible or to be invisible—which of those states is going to lead to fulfillment or happiness or just a peaceful life?” Ward says. “I think Fielding struggled with that question throughout all of her writings. [A traditional domestic setting] seems pretty desirable for Fielding, and on the other hand there’s this chafing against that.” In the end, Fielding had no choice but to continue her writing career. The Governess was an incredible success and the first of its kind; only five years earlier, John Newbery had published A Little Pretty Pocket-Book, considered the first children’s book (and distinct from Fielding’s work, which was the first children’s novel). Unfortunately, even the success of that book wasn’t enough to bring her total financial security. Fielding continued to get by with the support of friends, but was never fully independent, despite the popularity of her work. By the time of Fielding’s death in 1768, The Governess was in its fifth edition and would remain in print for more than 150 years. For Ward, recognizing the role Fielding and other women writers of the period played is an important step in correcting the historical record. For years, scholars mainly focused on men’s writing and dismissed women altogether. But almost as important is the role that 18th-century writing played in the development of modern thought.
“What we have we inherited from 18th-century England,” Ward says. “Our institutions, our ideas about education, about work, it’s all really grounded in the 18th century. That’s when the ideas were articulated in ways we recognize. They went into the formation of our founding documents.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
25a668a73669a180ebd95d12b6924b9a
https://www.smithsonianmag.com/history/five-new-nonfiction-books-read-while-youre-stuck-home-180974524/
Five New Nonfiction Books to Read While You’re Stuck at Home
Five New Nonfiction Books to Read While You’re Stuck at Home As global efforts to contain the novel coronavirus pandemic continue, millions of people around the world are practicing social distancing and staying indoors. To help those in need of distraction from this stark new reality, Smithsonian magazine has compiled an array of resources, including articles on cultural, historical and scientific collections you can explore online; museums you can virtually visit; and ways to experience the Smithsonian Institution from home. Now, we’re adding another offering to the list. Each Friday, Smithsonian will publish a roundup of five newly released nonfiction books in the fields of history, science, arts and culture, innovation and travel. Selections represent texts that piqued our curiosity with their new approaches to oft-discussed topics, elevation of overlooked stories and artful prose. We’ve linked to Amazon for your convenience, but be sure to check with your local bookstore to see if it supports social distancing-appropriate delivery or pickup measures, too. For Ruby Laura Madison Wilson, her family’s ties to President James Madison had long been a point of pride. “Always remember—you’re a Madison,” she told her daughter, author Bettye Kearse. “You come from African slaves and a president.” Kearse, however, felt differently. She was unable to separate her DNA from the “humiliation, uncertainty, and physical and emotional harm” experienced by her enslaved ancestor, a woman named Coreen who was, in fact, the Founding Father’s half-sister. According to family tradition, as passed down by generations of griot oral historians, Madison raped Coreen, who gave birth to a son, Jim, around 1792. Jim’s son, Emanuel Madison, was Kearse’s great-great-grandfather. The Other Madisons marks the culmination of Kearse’s 30-year investigation into not only her own family history, but that of other enslaved and free African Americans whose voices have been silenced over the centuries. Though she lacks conclusive DNA or documentary evidence linking her to Madison, Kearse hasn’t let this upend her sense of identity. As the retired pediatrician writes on her website, “[H]ow could I prove my family’s story if slaves … were not included as people in the history that mattered to those who created and maintained the records? The problem is not DNA, I realized; the problem is the Constitution.” By the late 1970s and early ’80s, the once-ubiquitous specter of a figure standing at the side of the road, thumb raised in hopes of hitching a ride, had all but disappeared. In Roadside Americans, historian Jack Reid explores hitchhiking’s decline, tracing the practice’s evolution from a common mode of travel to a “taboo form of mobility reserved for desperate and often unsavory individuals.” Between the Great Depression and the mid-1970s, argues Reid, “thumb tripping” served as a manifestation of counterculture, uniting students, activists and travelers of all ages in an act of communal goodwill. But as President Ronald Reagan’s brand of conservatism gained traction in the 1980s, this sense of “trust and social reciprocity,” according to one reviewer, vanished, leading the public to associate the act of hitchhiking with irresponsible behavior, crime, promiscuity and casual drug use. 
Perceptions of the practice, the author writes, “evolved over time in [sync] with broader economic, political and cultural shifts.” The 1938 Pau Grand Prix has all the trappings of a blockbuster Hollywood film: cars, chaos, colorful characters, a competition between good and evil—in this case France and Nazi Germany. But until Neal Bascomb, best-selling author of The Winter Fortress: The Epic Mission to Sabotage Hitler’s Atomic Bomb, decided to make the race the focus of his latest book, the tale remained little-known. Now, the story of Jewish driver René Dreyfus (nicknamed “Jesse Owens on wheels” in the New York Times’ review of the book); heiress and formidable fellow racer Lucy Schell; Charles Weiffenbach, head of French automaker Delahaye; and Nazi driver Rudolf Caracciola has come roaring to life in truly cinematic fashion. Without spoiling the Grand Prix’s conclusion—though readers can probably predict how the race turned out—know that Faster chronicles what its publisher deems an “inspiring, death-defying” venture that Adolf Hitler soon strove to completely erase from history. Tales of valiant kings’ and princes’ battle prowess abound in mythology and the historical record alike, but as father-daughter duo Jonathan W. Jordan and Emily Anne Jordan argue in The War Queens, male monarchs weren’t the only ones who rallied their armies to victory. From England’s Elizabeth I to Tamar of Georgia, Scythia’s Queen Tomyris, and more modern figures like Indira Gandhi and Golda Meir, women leaders have long defied gender conventions by wielding weapons and protecting their kingdoms. Angola’s Queen Nzinga, for instance, personally led soldiers on the battlefield, using guerrilla warfare tactics to resist Portuguese invaders during the 1640s. Fifteenth-century Italian noblewoman Caterina Sforza, meanwhile, “defended her … holdings with cannon and scimitar,” defying Borgia Pope Alexander VI’s besieging forces for almost a month. In the years between World War I and World War II, American journalists struggled to address many of the same debates that dominate today’s media landscape: democracy versus authoritarianism, interventionism versus isolationism, and objectivity versus propaganda, among others. Nancy F. Cott’s Fighting Words highlights four foreign correspondents—Dorothy Thompson, John Gunther, Vincent Sheean and Rayna Raphaelson Prohme—who wrestled with these issues. Cott draws on details from their personal lives and reporting trips to China, Palestine, Moscow and Berlin to reveal both “the making of the modern self,” in the words of publisher Hachette, as well as the role international reporting played in shaping the United States’ own burgeoning national identity. Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
e6cc7fa713f3d70aa22a64d59d6b91d9
https://www.smithsonianmag.com/history/for-studs-terkel-chicago-was-a-city-called-heaven-121647361/
For Studs Terkel, Chicago Was a City Called Heaven
For Studs Terkel, Chicago Was a City Called Heaven Editor's Note, May 16, 2012: Studs Terkel, the Pulitzer-Prize winning author and historian, reflected on the character of the city of Chicago for us in 2006. He died in 2008 at the age of 96. Today would have been his 100th birthday. Hog Butcher for the World, Tool Maker, Stacker of Wheat, Player with Railroads and the Nation's Freight Handler; Stormy, husky, brawling, City of the Big Shoulders... Carl Sandburg, the white-haired old Swede with the wild cowlick, drawled out that brag in 1914. Today, he is regarded in more soft-spoken quarters as an old gaffer, out of fashion, more attuned to the street corner than the class in American studies. Unfortunately, there is some truth to the charge that his dug-out-of-the-mud city, sprung-out-of-the-fire-of-1871 Chicago, is no longer what it was when the Swede sang that song. It is no longer the slaughterhouse of the hang-from-the-hoof heifers. The stockyards have gone to feedlots in, say, Clovis, New Mexico, or Greeley, Colorado, or Logansport, Indiana. It is no longer the railroad center, when there were at least seven awesome depots, where a thousand passenger trains refueled themselves each day; and it is no longer, since the Great Depression of the 1930s, the stacker of wheat. During all these birth years of the 21st century, the unique landmarks of American cities have been replaced by Golden Arches, Red Lobsters, Pizza Huts and Marriotts, so you can no longer tell one neon wilderness from another. As your plane lands, you no longer see old landmarks, old signatures. You have no idea where you may be. A few years ago, while I was on a wearisome book tour, I mumbled to the switchboard operator at the motel, "Please wake me at 6 a.m. I must be in Cleveland by noon." Came the response: "Sir, you are in Cleveland." That Chicago, too, has so been affected is of small matter. It has been and always will be, in the memory of the 9-year-old boy arriving here, the archetypal American city. One year after Warren G. Harding's anointment, almost to the day, the boy stepped off the coach at the La Salle Street depot. He had come from east of the Hudson and had been warned by the kids on the Bronx block to watch out for Indians. The boy felt not unlike Ruggles, the British butler, on his way to Red Gap. Envisioning painted faces and feathered war bonnets. August 1921. The boy had sat up all night, but was never more awake and exhilarated. At Buffalo, the vendors had passed through the aisles. A cheese sandwich and a half-pint carton of milk was all he had during that twenty-hour ride. But on this morning of the great awakening, he wasn't hungry. His older brother was there at the station. Grinning, gently jabbing at his shoulder. He twisted the boy's cap around. "Hey, Nick Altrock," the brother said. He knew the boy knew that this baseball clown with the turned-around cap had once been a great pitcher for the White Sox. The boy's head as well as his cap was awhirl. There was expensive-looking luggage carried off the Pullmans. Those were the cars up front, a distant planet away from the day coaches. There were cool Palm Beach-suited men and even cooler, lightly clad women stepping down from these cars. Black men in red caps—all called George—were wheeling luggage carts toward the terminal. My God, all those bags for just two people. Twentieth Century Limited, the brother whispered. Even got a barbershop on that baby. There were straw suitcases and bulky bundles borne elsewhere. 
These were all those other travelers, some lost, others excitable in heavy, unseasonal clothing. Their talk was broken English or a strange language or an American accent foreign to the boy. Where were the Indians? This was Chicago, indubitably the center of the nation's railways, as the Swede from Galesburg had so often sung out. Chicago to Los Angeles. Chicago to Anywhere. All roads led to and from Chicago. No wonder the boy was bewitched. Chicago has always been and still is the City of Hands. Horny, calloused hands. Yet, here they came: the French voyageurs; the Anglo traders; the German burghers, many of whom were the children of those dreamers who dared dream of better worlds. So it was that the Chicago Symphony Orchestra, one of the world's most regarded, came into being. It was originally Teutonic in its repertoire; now it is universal. They came, too, from Eastern Europe as Hands. The Polish population of Chicago is second only to that of Warsaw. They came from the Mediterranean and from below the Rio Grande; and there was always the inner migration from Mississippi, Arkansas, Louisiana and Tennessee. The African-American journalist, grandson of slaves, spoke with a touch of nostalgia, memories of his hometown, Paris. That is, Paris, Tennessee. "Out in the fields, we'd hear the whistle of the Illinois Central engineer. OOOweee! There goes the IC to—Chica-a-ago!" It was even referred to in the gospel song "City Called Heaven." The city called heaven, where there were good jobs in the mills and you did not have to get off the sidewalk when a white passed by. Jimmy Rushing sang the upbeat blues, "Goin' to Chicago, Baby, Sorry I Can't Take You." Here I came in 1921, the 9-year-old, who for the next 15 years lived and clerked at the men's hotel, the Wells-Grand. (My ailing father ran it, and then my mother, a much tougher customer, took over.) To me, it was simply referred to as the Grand, the Chicago prototype of the posh pre-Hitler Berlin Hotel. It was here that I encountered our aristocrats as guests: the boomer firemen, who blazed our railroad engines; the seafarers who sailed the Great Lakes; the self-educated craftsmen, known as the Wobblies but whose proper name was the Industrial Workers of the World (IWW). Here in our lobby, they went head-to-head with their bêtes noires, the anti-union stalwarts, who tabbed the IWW as the acronym of "I Won't Work." Oh, those were wild, splendiferous debates, outdoing in decibel power the Lincoln-Douglas bouts. These were the Hands of Chicago making themselves heard loud and clear. It was the truly Grand Hotel, and I felt like the concierge of the Waldorf-Astoria. There were labor battles, historic ones, where the fight for the eight-hour day had begun. It brought forth the song: "Eight hours we'd have for working, eight hours we'd have for play, eight hours for sleeping, in free Amerikay." It was in Chicago that the Haymarket Affair took place and four men were hanged in a farcical trial that earned our city the world's opprobrium. Yet it is to our city's honor that our governor, John Peter Altgeld, pardoned the three surviving defendants in one of the most eloquent documents on behalf of justice ever issued. The simple truth is that our God, Chicago's God, is Janus, the two-faced one. One is that of Warner Brothers movie imagination, with Jimmy Cagney and Edward G. Robinson as our sociopathic icons. The other is that of Jane Addams, who introduced the idea of the Chicago Woman and world citizen.
It was Chicago that brought forth Louis Sullivan, whom Frank Lloyd Wright referred to as Lieber Meister. Sullivan envisioned the skyscraper. It was here that he wanted to touch the heavens. Nor was it any accident that young Sullivan corresponded with the elderly Walt Whitman, because they both dreamed of democratic vistas, where Chicago was the city of man rather than the city of things. Though Sullivan died broke and neglected, it is his memory that glows as he is recalled by those who followed Wright. What the 9-year-old boy felt about Chicago in 1921 is a bit more mellow and seared. He is aware of its carbuncles and warts, a place far from Heaven, but it is his town, the only one he calls home. Nelson Algren, Chicago's bard, said it best: "Like loving a woman with a broken nose, you may well find lovelier lovelies. But never a lovely so real."
57e76c9c4079456ac0ed08f71fc144ec
https://www.smithsonianmag.com/history/forgotten-history-womens-football-180958042/
The Forgotten History of Women’s Football
The Forgotten History of Women’s Football This season, to much media acclaim, the National Football League hired its first female referee and two female coaches (one intern and one full-time). Considering the attention these additions generated, one could be forgiven for thinking that women were just beginning to get into the sport at any professional level. But in the period between World Wars I and II, a women’s football league was nearly popular enough to become mainstream entertainment. Little is known about this short-lived women’s football craze. What remains are the photographs of players from Los Angeles that appeared in two national magazines: first Life in November 1939, then Click the following January. The black-and-white photos showed tough-looking women dressed in full football uniforms, including helmets, pants and shoulder pads. The magazine articles provided hardly any information about these pioneering athletes, though, other than assuring readers that they played “hard, fast” regulation tackle football. Who were these women, and why did their football league disappear after only one season, despite the Click article’s assertion that it would be expanding nationwide the next fall? Many might assume that public disapproval was to blame. “There was an identification of football with masculinity, much like boxing or wrestling. For women to be playing it would have been seen as an extreme violation of the gender norms,” says football historian Michael Oriard, who mentions the Los Angeles players in his 2004 book King Football. There may be a different explanation for the league’s demise, though, one that has more to do with the women themselves. My first clue was one of the teams referred to in the Life article: the Marshall-Clampett Amazons. I had come across a Marshall-Clampett fastpitch softball team from Los Angeles while researching my book on the history of the sport (the team was named after its car dealership sponsor). A search of the California newspaper archives turned up an article in the Palm Springs Desert Sun that confirmed the Marshall-Clampett football and softball teams were not only related but were, in fact, the same team—the football players featured in the Life article were actually softball players first and foremost. It’s likely that the three other teams in the Los Angeles women’s football league were composed of softball players, too. In a 2013 blog post, Melitas Forster, a former Marshall-Clampett player, recalled that a softball promoter had organized the football league. Women’s softball was extremely popular in the late 1930s, especially in Los Angeles, where Hollywood celebrities attended games. The same Desert Sun article discussed a charity softball game between the team and a men’s squad that included silent film star Buster Keaton. (Incidentally, the Marshall-Clampett players wound up defeating Buster Keaton’s Palm Springs team, 5-4.) The women’s football games appear to have been an attempt to capitalize on this popularity and extend ticket sales from fall, when the softball season ended, into winter. If this was indeed the plan, it worked. In addition to attracting national media attention, the games drew crowds of 3,000 or more. There were some negative reactions to the women’s football games.
A news wire article published in November 1939 described them as an invasion of “one of the last strongholds of masculinity.” The Life article also argued that football was too dangerous for women, warning that “a blow either on the breasts or in the abdominal region may result in cancer or internal injury.” Still, the more likely reason the Los Angeles league ended was that the players were already committed to softball, which offered considerably more opportunities than football did. Being featured in national magazines paled in comparison to the perks that came with being a 1930s Los Angeles softball player, which included traveling to overseas destinations, such as Japan, and getting to appear in movies, such as the 1937 Rita Hayworth film Girls Can Play. Though football was likely more dangerous, the Los Angeles players still played a physical enough game of softball to incur strained muscles and the occasional concussion. But there was little incentive for them to risk hurting themselves playing football unless the league expanded, and it didn’t. “Let the boys get their heads kicked off. We’ll stick to softball,” some of the players told the Orange County, California, Daily News. A year later, in the summer of 1941, a second women’s football league attempted to form. This time the setting was Chicago, and once again, many of the players came from softball. The teams only played a few games, though, and they received little publicity other than a few local newspaper articles. By the next year, with the U.S. entrance to World War II, talk of women’s football mostly disappeared until the 1970s, when a semi-professional league based primarily in Ohio and Texas emerged. This league received more media coverage, with articles in magazines, such as Texas Monthly, Ebony and Jet. It failed to reach a wide audience, though, and, like the 1939 California league, was soon forgotten. The cost-benefit analysis that the Amazons of the 1930s made has had a modern-day resurgence. Retired football player Antwaan Randle El told The Washington Post that if faced with the decision again to play football or play baseball (he was drafted in the 14th round by the Chicago Cubs), he would select baseball, citing football’s physical toll. And with the inherent dangers of playing football becoming daily news material, it’s unclear whether a professional women’s football league will ever again take hold.
ba70d2ea5f150dde9095b987d20eb94f
https://www.smithsonianmag.com/history/four-hundred-years-black-history-keisha-blain-180976880/
How to Tell 400 Years of Black History in One Book
How to Tell 400 Years of Black History in One Book In August of 1619, the English warship White Lion sailed into Hampton Roads, Virginia, where the James, Elizabeth and York rivers meet the Atlantic Ocean. The White Lion’s captain and crew were privateers, and they had taken captives from a Dutch slave ship. They exchanged, for supplies, more than 20 African people with the leadership and settlers at the Jamestown colony. In 2019 this event, while not the first arrival of Africans or the first instance of slavery in North America, was widely recognized as inaugurating race-based slavery in the British colonies that would become the United States. That 400th anniversary is the occasion for a unique collaboration: Four Hundred Souls: A Community History of African America, 1619-2019, edited by historians Ibram X. Kendi and Keisha N. Blain. Kendi and Blain brought together 90 black writers—historians, scholars of other fields, journalists, activists and poets—to cover the full sweep and extraordinary diversity of those 400 years of black history. Although its scope is encyclopedic, the book is anything but a dry, dispassionate march through history. It’s elegantly structured in ten 40-year sections composed of eight essays (each covering one theme in a five-year period) and a poem punctuating the section conclusion; Kendi calls Four Hundred Souls “a chorus.” The book opens with an essay by Nikole Hannah-Jones, the journalist behind the New York Times’ 1619 Project, on the years 1619-1624, and closes with an entry from Black Lives Matter co-creator Alicia Garza writing about 2014-19, when the movement rose to the forefront of American politics. The depth and breadth of the material astounds, ranging from fresh voices, such as historian Mary Hicks writing about the Middle Passage for 1694-1699, to internationally renowned scholars, such as Annette Gordon-Reed writing about Sally Hemings for 1789-94. Prominent journalists include, in addition to Hannah-Jones, The Atlantic’s Adam Serwer on Frederick Douglass (1859-64) and New York Times columnist Jamelle Bouie on the Civil War (1864-69). The powerful poems resonate sharply with the essays, Chet’la Sebree’s verses in “And the Record Repeats” about the experiences of young black women, for example, and Salamishah M. Tillet’s account of Anita Hill’s testimony in the Senate confirmation hearings for Supreme Court Justice Clarence Thomas. “We are,” Kendi writes in the introduction, speaking collectively of black Americans, “reconstructing ourselves in this book.” The book itself, Blain writes in the conclusion, is “a testament to how much we have overcome, and how we have managed to do it together, despite our differences and diverse perspectives.” In an interview, Blain talked about how the project and the book’s distinctive structure developed, and how the editors imagine it will fit into the canon of black history and thought. A condensed and edited version of her conversation with Smithsonian is below. Four Hundred Souls is a unique one-volume “community” history of African Americans. The editors, Ibram X. Kendi and Keisha N. Blain, have assembled 90 brilliant writers, each of whom takes on a five-year period of that four-hundred-year span. How did the Four Hundred Souls book come about? We started working on the project in 2018 (it actually predates the [publication of the] New York Times 1619 Project).
Ibram reached out to me with the idea that with the 400th anniversary of the first captive Africans arriving in Jamestown, maybe we should collaborate on a project that would commemorate this particular moment in history, and look at 400 years of African American history by pulling together a diverse set of voices. The idea was that we'd be able to create something very different from any other book on black history. And as historians, we were thinking, what would historians of the future want? Who are the voices they would want to hear from? We wanted to create something that would actually function as a primary source in another, who knows, 40 years or so—that captures the voices of black writers and thinkers from a wide array of fields, reflecting on both the past and the present. Did you have any models for how you pulled all these voices together? There are a couple of models in the sense of the most significant, pioneering books in African American history. We thought immediately of W.E.B. Du Bois' Black Reconstruction in America in terms of the scope of the work, the depth of the content, and the richness of the ideas. Robin D.G. Kelley's Freedom Dreams is another model, but more recent. Martha Jones' Vanguard is a book that captures decades of black women's political activism and the struggle for the vote in a way that, I think, does a similar kind of broad, sweeping history. Daina Ramey Berry and Kali N. Gross's A Black Women's History of the United States is another. But ours was not a single-authored book or even an edited collection of just historians. We didn't want to produce a textbook or an encyclopedia. We wanted this work to be, as an edited volume, rich enough and big enough to cover 400 years of history in a way that would keep the reader engaged from start to finish, 1619 to 2019. That's part of the importance of the multiple different genres and different voices we included moving from period to period. How does Four Hundred Souls reflect the concept of a community history? We figured that community would show up in different ways in the narrative, but we were really thinking initially, how do we recreate community in putting this book together? One of the earliest analogies that Ibram used was describing this as a choir. I love this—he described the poets as soloists. And then in this choir, you'd have sopranos, you'd have tenors, and you'd have altos. And so the question was: Who do we invite to be in this volume that would capture collectively that spirit of community? We recognized that we could never fully represent every single field and every single background, but we tried as much as possible. And so even in putting together the book, there was a moment where we said, for example, "Wait a minute, we don't really have a scholar here who would be able to truly grapple with the sort of interconnection between African American history and Native American history." So we thought, is there a scholar who identifies as African American and Native American? And then we reached out to [UCLA historian] Kyle Mays. So there were moments where we just had to be intentional about making sure that we had voices that represented as much as possible the diversity of black America. We invited Esther Armah to write about the black immigrant experience because what is black America without immigrants? The heart of black America is that it's not homogenous at all—it's diverse. And we tried to capture that.
We also wanted to make sure that a significant number of the writers were women, largely because we acknowledge that so many of the histories that we teach, that we read, and that so many people cite are written by men. There's still a general tendency to look for male expertise, to acknowledge men as experts, especially in the field of history. Women are often sidelined in these conversations. So we were intentional about that, too. In including someone like Alicia Garza, one of the founders of Black Lives Matter, we wanted to acknowledge the crucial role that black women are playing in shaping American politics to this very day. How did historians approach their subjects differently than, say, creative writers? One of the challenges with the book, which also turned out to be an opportunity, was that we were focusing on key historical moments, figures, themes and places in the United States, each within a very specific five-year period. We actually spent a lot of time mapping out instructions for authors. It wasn't just: “Write a piece for us on this topic.” We said, “Here's what we want and what we don't want. Here's what we expect of you: ask these questions as you're writing the essay, and make sure you're grappling with these particular themes.” But they also had to have a bit of freedom, to look backward, and also to look forward. And I think the structure with a bit of freedom worked; it was a pretty nice balance. For some essays the five years just fit like a glove, for others a little less so, but the writers managed to pull it off. We also spent a lot of time planning and carefully identifying who would write on certain topics. “Cotton,” which memoirist Kiese Laymon wrote about for 1804-1809, is a perfect example. We realized very early that if we asked a historian to write about cotton, they would be very frustrated with the five-year constraint. But when we asked Kiese, we let him know that we would provide him with books on cotton and slavery for him to take a look at. And then he brought to it his own personal experience, which turned out to be such a powerful narrative. He writes, “When the land is freed, so will be all the cotton and all the money made off the suffering that white folks made cotton bring to Black folks in Mississippi and the entire South.” And so that's the other element of this too. A lot of people wondered how we would have a work of history with so many non-historians. We gave them clear guidance and materials, and they brought incredible talent to the project. The New York Times’ 1619 Project shares a similar point of origin, the 400th anniversary of the arrival of enslaved Africans to colonial America. What did you make of it when it came out last year? When the 1619 Project came out, [Ibram and I] were thrilled, because actually, in so many ways, it complemented our vision for our project. Then we decided we really had to invite Nikole Hannah-Jones to contribute. We weren't sure who we would ask for that first essay, but then we were like, "You know what? This makes sense." I know there are so many different critiques, but for me, what is most valuable about the project is the way that it demonstrates how much, from the very beginning, the ideas and experiences of black people have been sidelined. This is why we wanted her to write her essay [about the slave ship White Lion]. Even as someone who studied U.S. history, I did not even know about the White Lion for many years. I mean, that's how sad it is…but I could talk about the Mayflower.
That was part of the history that I was taught. And so what does that tell us? We don't talk about 1619 the way that we do 1620. And why is that? Well, let's get to the heart of the matter. Race matters and racism, too, in the way that we even tell our histories. And so we wanted to send that message. And like I said, to have a complementary spirit and vision as the 1619 Project. When readers have finished going through 400 Souls, where else can they read black scholars writing on black history? One of the things that the African American Intellectual History Society [Blain is currently president of the organization] is committed to doing is elevating the scholarship and writing of Black scholars as well as a diverse group of scholars who work in the field of Black history, and specifically Black intellectual history. Black Perspectives [an AAIHS publication] has a broad readership, certainly, we're reaching academics in the fields of history and many other fields. At the same time, a significant percentage of our readers are non-academics. We have activists who read the blog, well known intellectuals and thinkers, and just everyday lay people who are interested in history, who want to learn more about black history and find the content accessible. Karin Wulf is executive director of the Omohundro Institute of American History & Culture and a professor of history at William & Mary.
034a2f961e6c90b8443108756966205d
https://www.smithsonianmag.com/history/francis-scott-key-the-reluctant-patriot-180937178/
Francis Scott Key, the Reluctant Patriot
Francis Scott Key, the Reluctant Patriot One by one, the buildings at the heart of the American government went up in flames. On the evening of August 24, 1814, British troops torched the Capitol, the Treasury, the President’s House (not yet called the White House). All burned ferociously, as did the structures housing the War and the State departments. Battle-hardened redcoats had overwhelmed and scattered the largely untrained and poorly led American militiamen and regulars deployed to stop them from reaching the capital. President James Madison, along with his attorney general and secretary of state, had fled to safety across the Potomac River. Reporting news of the rout, the London Courier crowed: “War America would have, and war she has got.” As the flames rose across the capital on that sweltering August evening, the American government’s decision two years earlier to declare war on Britain—in a conflict that would come to be known as the War of 1812—seemed foolhardy and self-destructive. England remained a mighty world power, while the fledgling United States was strapped for cash, plagued by domestic discord and militarily weak. Donald Hickey, author of The War of 1812: A Forgotten Conflict, says, “The Army was understaffed, untrained, poorly equipped and led by superannuated and incompetent officers. The Navy was just plain outmatched by the Royal Navy.” The British had been largely responsible for provoking hostilities. Locked in a fierce struggle for global domination with Emperor Napoleon’s France, they brazenly interfered with neutral America’s lucrative maritime commerce with Europe by seizing American ships and forcing kidnapped American seamen to meet the need for manpower on British naval vessels. “At this point,” says historian Douglas Egerton, author of Gabriel’s Rebellion and other works on antebellum America, “England still regarded American trade as part of their domain—even after the Revolution. Britain wanted to prevent American foodstuffs and other goods from reaching France; they needed to cut off that trade in order to help them win against Napoleon.” No matter how unequal the balance of power was between the United States and Great Britain, President Madison nevertheless condemned Britain’s “progressive usurpations and accumulating wrongs,” asserting that such outrages would not be tolerated by a nation that had earned its right to international respect through victory in the American Revolution three decades earlier. From the moment hostilities commenced, in July 1812, British naval ships engaged U.S. vessels along the Eastern Seaboard, and British and American forces began skirmishing along the Northwest frontier and in Canada. In Congress, the hawks advocated an attempt to annex Canada, thereby reducing British influence in the contested Northwest. Thomas Jefferson, the former president, predicted that such a venture would be “a mere matter of marching.” The torching of the capital was said to be in retaliation for the burning of buildings in York (near present-day Toronto) by American troops earlier in the war. Now, dismay and anxiety reverberated across the country. Would New York be next? Philadelphia? The Royal Navy could put troops ashore anywhere along the Atlantic Coast. Despite such forebodings, the burning of Washington did not herald disaster for the floundering American cause.
Instead, it turned out to be the prelude to one of the most celebrated expressions of patriotic fervor in the young country’s history: Francis Scott Key’s composition of “The Star-Spangled Banner,” written following the British attack on Baltimore Harbor three weeks after the assault on the capital. After setting Washington ablaze and raiding adjoining Alexandria, Virginia, the British turned on Baltimore, 40 miles north. They confidently expected America’s third largest city (exceeded in population only by New York and Philadelphia) to fall as easily as the capital. A Royal Navy fleet proceeded from the Chesapeake Bay into the wide mouth of the Patapsco River and positioned itself to bombard Fort McHenry at the entrance to Baltimore Harbor. It was to be a coordinated land-sea operation. Once the fort had been silenced, British strategists predicted, the redcoats would take and plunder Baltimore, attempting to underscore the futility of any further challenge by the Americans. The British launched a withering bombardment of Fort McHenry on a rainy September 13. For much of the onslaught, shells and rockets fell on the fort at the rate of almost one a minute. American major George Armistead, commander of Fort McHenry, estimated that “from fifteen to eighteen hundred shells” were fired during the attack. At the time, Francis Scott Key, a 35-year-old Washington lawyer and writer of occasional verse, found himself detained on a British ship within sight of the fort. The son of a distinguished judge, he had been born into a family of wealthy plantation owners based in Keymar, Maryland. Key was in British custody due to an incident that had occurred two weeks earlier, when a 65-year-old physician, William Beanes, confronted some British soldiers who had tried to plunder his Upper Marlboro, Maryland, home. One of the soldiers complained to his officers, who had the doctor placed under arrest. He was escorted to one of their vessels in the Chesapeake Bay. Learning of the incarceration through Richard West, his wife’s brother-in-law, Key agreed to act on Beanes’ behalf and received permission from President Madison to try to negotiate his release. On the face of it, Key seemed an unlikely candidate to write what would become the national anthem. He had referred to the conflict as “abominable” and a “lump of wickedness,” siding with the many Americans—a majority, according to Republican South Carolina congressman William Lowndes—who believed that a diplomatic accommodation with Britain could have avoided hostilities altogether. The Senate vote in favor of a declaration of war, taken on June 17, 1812, had split 19 to 13, reflecting fundamental differences between members of the largely pro-war Republicans and the largely antiwar Federalists. In the House of Representatives, the vote had been 79 to 49, with Republicans once again in favor. It was the closest vote on any declaration of war in American history. Opposition had been particularly vehement in the Northeast. In New York that autumn of 1812, antiwar Federalist candidates made major electoral gains in Congressional contests. By the waning months of that year, the Massachusetts legislature passed a resolution urging citizens to resist the war effort. Antiwar sentiments ran deep in other parts of the country as well.
Key’s friend, maverick Republican congressman John Randolph of Virginia, said the war would be financed by the “blood and treasure of the people.” Critics charged, too, that Congressional “war hawks”—Southern for the most part—were promoting the cause of settlers and speculators who eagerly eyed land in British-held Canada and Spanish Florida. The War of 1812, says historian Hickey, was, even given Vietnam, the most “vigorously opposed war with a foreign power in our history.” When news of the war reached New England, a few days after the June 17 vote in Congress, church bells in many Northeastern towns and villages tolled slowly in mourning, and shopkeepers closed their businesses in protest. By the time hostilities had dragged on for an inconclusive year and a half, delegates from New England convened in Hartford, Connecticut, to debate whether the Northeastern states should secede from the Union and establish a separate American nation. Massachusetts governor Caleb Strong made overtures to the British commander in Halifax, Nova Scotia, Sir John Coape Sherbrooke, to consider prospects for a separate peace. Historian Egerton believes that had the war gone on much longer, that “process of separation would surely have begun.” At the time, he says, “it seemed as if the war could continue indefinitely. From the [New Englanders’] point of view, they had a president who had destroyed their maritime economy and was also getting Americans killed in an unnecessary war.” However opposed to America’s entry into the war he had been, Key had been outraged by British incursions up the Chesapeake, the attack on the nation’s capital and the capture of Beanes. On September 7, 1814, Key, accompanied by American prisoner-exchange officer John Skinner, boarded the Tonnant, flagship of the British fleet, where Beanes was being held. They carried with them letters from British officers who had been treated by Beanes after being wounded during a skirmish in Bladensburg, Maryland. Within hours, the Americans had persuaded a British commander, Maj. Gen. Robert Ross, to release the doctor. By then, however, the assault on Baltimore was imminent; the three Americans, guarded by British marines, were obliged to wait out the battle aboard the British sloop some eight miles upriver from Fort McHenry. From the vessel, they anxiously watched the bombardment of the fort through the daylight hours of September 13. According to Key, “It seemed as though mother earth had opened and was vomiting shot and shell in a sheet of fire and brimstone.” But as darkness descended, Key could see little more of the battle than the “red glare” of the enemy’s newly designed gunpowder-propelled Congreve rockets tracing fiery arcs across the sky. “The heavens aglow were a seething sea of flame,” he later wrote to his friend John Randolph. In the “angry sea,” as Key described conditions on that stormy night, the flag-of-truce sloop was “tossed as though in a tempest.” Key was alarmed by the sound of “bombs bursting in air”—British shells detonating short of their target. It seemed unlikely, Key would later recall, that American resistance at the fort could withstand such a pounding. Not until the mists dissipated at dawn on September 14 did he learn the outcome of the battle.
“At last,” he later wrote, “a bright streak of gold mingled with crimson shot athwart the eastern sky, followed by another, and still another, as the morning sun rose.” Gradually he was able to discern not the British Union Jack that he had feared, but still, defiantly, an American flag, enormous in its dimensions, fluttering in the breeze from the flagpole of an undefeated Fort McHenry. The fort had not fallen: Baltimore remained safe. It was, he later wrote, a “most merciful deliverance.” Major Armistead, the fort commander, could take credit for the flag’s spectacular size, 30 by 42 feet. Leaving no detail to chance in his preparations for the fort’s defense, he envisioned a dramatic emblem, commissioning Baltimore flag maker Mary Young Pickersgill to stitch a banner so large that the enemy would “have no difficulty in seeing it from a distance.” Mrs. Pickersgill had duly supplied the massive flag, sewn of wool bunting. Each of its 15 stars was about two feet across; its 15 stripes were about two feet wide. History does not record with certainty whether the flag Key saw that fateful morning was the one flown during the bombardment itself. Some historians suggest that a 17- by 25-foot storm flag also sewn by Mrs. Pickersgill may have been run up the flagpole during the downpour, consistent with common practice. The famous Star-Spangled Banner—today one of the greatest treasures of the Smithsonian’s National Museum of American History—may not have been raised until first light on September 14. “At dawn on the 14th,” wrote militiaman Isaac Monroe of the Baltimore Fencibles, “our morning gun was fired, the flag hoisted, [and] Yankee Doodle played. . . . ” No thoroughly detailed account of this extraordinary moment exists, but we do know that Key was still aboard the Tonnant when he began composing a verse about the experience—and his relief at seeing the Stars and Stripes still waving. He used the only writing paper at hand: the back of a letter he pulled from his pocket. He had not yet learned that the British commander who’d been Beanes’ liberator, Maj. Gen. Robert Ross, had been killed by a sniper en route to Baltimore. Almost immediately, the entire British fleet began to withdraw. Key and his companions, including Beanes, were released. On their passage back to shore, Key expanded the few lines he had scrawled. In his lodging at a Baltimore inn the following day, he polished his draft into four stanzas. Key’s brother-in-law Joseph Nicholson, a commander of a militia at Fort McHenry, had the poem printed for distribution to the public. Entitled “Defence of Fort M’Henry,” the verse was accompanied by a suggestion that it be set to the music of an English drinking song. Before the week was out, the poem had been reprinted in the pages of the Baltimore Patriot newspaper, which pronounced it a “beautiful and animating effusion” destined “long to outlive the impulse which produced it.” Rechristened “The Star-Spangled Banner” soon thereafter, Key’s words were, within weeks, appearing in newspapers across the nation. In England, news of the setback in Baltimore was met with dismay. The London Times called it a “lamentable event.” The British public had grown increasingly critical of the conflict, their frustration compounded by crippling losses to the British economy; the suspension of lucrative trade with America, coupled with the staggering costs Britain had incurred during its war with Napoleon’s France, had spread hardship across the land.
“The tax burden on British citizens was crushing,” says historian Hickey. “England had been at war with France for over two decades.” The United States was counting costs too. Confronted with a war-induced financial crisis and the realization that no substantial benefits were likely to accrue as a result of the conflict, President Madison and Congress accepted that the time had come to reach a peace settlement. Negotiations, conducted on neutral ground in Belgium at Ghent, were rapidly concluded; a treaty that provided neither country with major concessions was signed December 24, 1814. No significant territorial exchanges took place. The United States tacitly accepted its failure to annex Canada. As for British harassment of American maritime commerce, most of that had lapsed when the British-French Napoleonic Wars ended with the defeat of the French emperor a few months earlier. Although neither side achieved decisive or lasting military gain, the conflict did have beneficial consequences for the United States. The nation emerged stronger, at least internationally. No matter how poorly prepared the United States had been, the government’s readiness to take up arms against a mighty foe substantially enhanced American prestige abroad. Former president Thomas Jefferson said the war demonstrated that “our government . . . can stand the shock of war.” Delaware senator James Bayard expressed a commonly held sentiment when he vowed: “It will be a long time before we are disturbed again by any of the powers of Europe.” Indeed, within a decade, Madison’s successor, James Monroe, formulated the Monroe Doctrine, which put “European powers” on notice that the United States would tolerate no further colonization in the “American continents.” The war had domestic consequences as well. Hickey believes that America actually lost the war “because we did not achieve our war aims—perhaps most significantly, we failed to achieve our territorial ambition to conquer or annex Canada.” In Hickey’s estimation, Madison showed himself to be “one of the weakest war presidents in America’s history” for failing to work effectively with Congress, control his cabinet or provide coherent leadership. But in the public mind his successes—the defense of Fort McHenry and the defeat, against all odds, of a Royal Navy squadron on Lake Champlain—outweighed his shortcomings. The greatest boost to American self-esteem was Gen. Andrew Jackson’s victory in the Battle of New Orleans, which took place after the war had officially ended—the peace treaty having been signed in far-off Belgium more than a week earlier. “Americans were aware of the many failures in the war,” says C. Edward Skeen, author of Citizen Soldiers in the War of 1812, but “to end the war on a high note certainly pumped up American pride,” particularly since “most counted simple survival [in the war] as a victory.” Patriotic emotions had the effect of diminishing, at least temporarily, the political and regional rivalries that had divided Americans since the founding of the nation. Former secretary of the treasury Albert Gallatin, one of the United States negotiators at Ghent, believed his countrymen now felt more American than ever. “They feel and act,” he said, “more like a nation.” That emergent sense of national identity had also acquired a potent emblem. Before the bombardment in Baltimore Harbor, the Stars and Stripes had possessed little transcendent significance: it functioned primarily as a banner to identify garrisons or forts.
Now the flag—and Key’s song inextricably linked to it—had become an emotionally charged symbol. Key’s “land of the free and the home of the brave” soon became a fixture of political campaigns and a staple of July Fourth celebrations. Still, more than a century would pass from its composition until the moment in 1931 when President Herbert Hoover officially proclaimed it the national anthem of the United States. Even then, critics protested that the lyrics, lengthy and ornate, were too unfamiliar to much of the public. Others objected that Key’s poem extolled military glory, equating patriotism “with killing and being killed . . . with intense hatreds and fury and violence,” as Clyde Miller, dean of Columbia University’s Teachers College, said in 1930. The New York Herald Tribune wrote that the song had “words that nobody can remember to a tune that nobody can sing.” Detractors, including New York civic leader Albert S. Bard, argued that “America the Beautiful” would make for a more suitable, more singable anthem. Despite the carping, Congress and Hoover conferred official status on “The Star-Spangled Banner” on March 3, 1931. Proponents had carried the day only after a campaign in which two sopranos, backed by a Navy band, demonstrated the song’s “singability” before the House Judiciary Committee. As for the huge flag that inspired the writing of the anthem, it came into fort commander Armistead’s hands not long after the Battle of Fort McHenry and remained in his family’s possession until 1907, when his grandson, Eben Appleton, offered it to the Smithsonian Institution. Today, Smithsonian experts are painstakingly conserving the flag. Enclosed in a climate-controlled laboratory, it is the centerpiece of an exhibition at the National Museum of American History. The treatment, which has taken five years, is expected to be completed this year. Although Francis Scott Key was a prolific writer, the only one of his poems to stand the test of time was “The Star-Spangled Banner.” Although it would ultimately elevate him into the pantheon of American heroes, Key was known during his lifetime primarily as a respected figure in legal and political circles. As a friend and adviser to President Andrew Jackson, he helped defuse pre-Civil War confrontations between the federal government and the state of Alabama. A religious man, Key believed slavery sinful; he campaigned for suppression of the slave trade. “Where else, except in slavery,” he asked, “was ever such a bed of torture prepared?” Yet the same man, who coined the expression “the land of the free,” was himself an owner of slaves who defended in court slaveholders’ rights to own human property. Key believed that the best solution was for African-Americans to “return” to Africa—although by then most had been born in the United States. He was a founding member of the American Colonization Society, the organization dedicated to that objective; its efforts led to the creation of an independent Liberia on the west coast of Africa in 1847. Although the society’s efforts were directed at the small percentage of free blacks, Key believed that the great majority of slaves would eventually join the exodus. That assumption, of course, proved to be a delusion. “Ultimately,” says historian Egerton, “the proponents of colonization represent a failure of imagination. They simply cannot envision a multiracial society.
The concept of moving people around as a solution was widespread and being applied to Indians as well.” When Key died at 63 on January 11, 1843, the Baltimore American declared that “so long as patriotism dwells amongst us, so long will this Song be the theme of our Nation.” Across America, statues have been erected to his memory. Key’s Georgetown house—where he lived with his wife, Polly, and 11 children—was removed to make way for a highway in 1947. The two-story brick dwelling, a national landmark by any measure, was dismantled and put in storage. By 1955, the building, down to the last brick, had disappeared from its storage site; it is presumed lost to history. By a joint resolution of Congress, a flag has flown continuously since May 30, 1949, over a monument marking his birthplace in Keymar, Maryland. It celebrates Key’s important role in shaping, as historians Bruce and William B. Catton once wrote, Americans’ belief “not merely in themselves but also in their future . . . lying just beyond the western horizon.”
535faf3bed8fa9c30e3b20dd83b9ecce
https://www.smithsonianmag.com/history/frederick-douglass-always-knew-he-was-meant-to-be-free-1-31552633/
Frederick Douglass Always Knew He Was Meant to Be Free
Frederick Douglass Always Knew He Was Meant to Be Free Frederick Douglass was 6 years old when he began his life as a slave. By the standards of slavery, Douglass often received favored treatment. But the realities included hunger, cold and seeing his fellow slaves savagely beaten. These realities of slavery were an outrage Douglass refused to accept almost from the first day, and the struggle begun then would in time anger and inspire millions of Americans. Escaping to freedom at age 20, Douglass soon established himself in the antislavery movement as a fearless enemy of the slave owner. In his lectures he spoke with wit, erudition and a richness of voice to rival Daniel Webster's. By all accounts, Douglass had an astonishing life. Before there was a civil rights movement, he led a movement in New York to desegregate schools. Later, his tireless voice helped lay the groundwork for the emancipation of slaves and for the 15th Amendment. Even at the end of his life, when he had seen the failure of his grand vision that freedom and the vote would win blacks an equal place with whites, he never stopped fighting to end prejudice. In 1894, in one of his greatest speeches, he implored the nation to "put away your race prejudice.... Recognize...that the rights of the humblest citizen are as worthy of protection as are those of the highest, and...your Republic will stand and flourish forever." Richard Conniff, a Smithsonian contributor since 1982, is the author of seven books about human and animal behavior.
45e5ad00b4b899fbced1d9708d3541be
https://www.smithsonianmag.com/history/frost-nixon-and-me-99350263/?no-ist=
Frost, Nixon and Me
Frost, Nixon and Me In May 1976, in a rather dim New York City hotel room filled with David Frost's cigar smoke, the British television personality put an intriguing proposition to me: leave your leafy academic perch for a year and prepare me for what could be a historic interrogation of Richard Nixon about Watergate. This would be the nation's only chance for no-holds-barred questioning of Nixon on the scandal that drove him to resign the presidency in 1974. Pardoned by his successor, Gerald Ford, Nixon could never be brought into the dock. Frost had secured the exclusive rights to interview him. Thus the prosecution of Richard Nixon would be left to a television interview by a foreigner. I took the job. The resulting Frost-Nixon interviews—one in particular—indeed proved historic. On May 4, 1977, forty-five million Americans watched Frost elicit a sorrowful admission from Nixon about his part in the scandal: "I let down my friends," the ex-president conceded. "I let down the country. I let down our system of government, and the dreams of all those young people that ought to get into government but now think it too corrupt....I let the American people down, and I have to carry that burden with me the rest of my life." If that interview made both political and broadcast history, it was all but forgotten two years ago, when the Nixon interviews were radically transformed into a piece of entertainment, first as the play Frost/Nixon, and now as a Hollywood film of the same title. For that televised interview in 1977, four hours of interrogation had been boiled down to 90 minutes. For the stage and screen, this history has been compressed a great deal more, into something resembling comedic tragedy. Having participated in the original event as Frost's Watergate researcher, and having had a ringside seat at its transformation, I've been thinking a lot lately about what is gained and what is lost when history is turned into entertainment. I had accepted Frost's offer with some reservations. Nixon was a skilled lawyer who had denied Watergate complicity for two years. He had seethed in exile. For him, the Frost interviews were a chance to persuade the American people that he had been done an epic injustice—and to make upwards of $1 million for the privilege. And in David Frost, who had no discernible political philosophy and a reputation as a soft-soap interviewer, Nixon seemed to have found the perfect instrument for his rehabilitation. Although Nixon's active role in the coverup had been documented in a succession of official forums, the absence of a judicial prosecution had left the country with a feeling of unfinished business. To hear Nixon admit to high crimes and misdemeanors could provide a national catharsis, a closing of the books on a depressing episode of American history. For all my reservations, I took on the assignment with gusto. I had worked on the first Watergate book to advocate impeachment. I had taken a year off from teaching creative writing at the University of North Carolina to witness the Ervin Committee hearings of 1973, from which most Americans' understanding of Watergate came, because I regarded the scandal as the greatest political drama of our time. My passion lay in my opposition to the Vietnam War, which I felt Nixon had needlessly prolonged for six bloody years; in my sympathy for Vietnam War resisters, who had been pilloried by the Nixonians; and in my horror over Watergate itself.
But I was also driven by my desire for engagement and, I like to think, a novelist's sense of the dramatic. To master the canon of Watergate was a daunting task, for the volumes of evidence from the Senate, the House and various courts would fill a small closet. Over many months I combed through the archives, and I came across new evidence of Nixon's collusion with his aide Charles Colson in the coverup—evidence that I was certain would surprise Nixon and perhaps jar him out of his studied defenses. But mastering the record was only the beginning. There had to be a strategy for compressing two years of history into 90 minutes of television. To this end, I wrote a 96-page interrogation strategy memo for Frost. In the broadcast, the interviewer's victory seemed quick, and Nixon's admission seemed to come seamlessly. In reality, it was painfully extracted from a slow, grinding process over two days. At my suggestion, Frost posed his questions with an assumption of guilt. When Nixon was taken by surprise—as he clearly was by the new material—you could almost see the wheels turning in his head and almost hear him asking himself what else his interrogator had up his sleeve. At the climactic moment, Frost, a natural performer, knew to change his role from inquisitor to confessor, to back off and allow Nixon's contrition to pour out. In Aristotelian tragedy, the protagonist's suffering must have a larger meaning, and the result of it must be enlightenment. Nixon's performance fell short of that classical standard—he had been forced into his admission, and after he delivered it, he quickly reverted to blaming others for his transgressions. (His reversion to character was cut from the final broadcast.) With no lasting epiphany, Nixon would remain a sad, less-than-tragic, ambiguous figure. For me, the transition from history to theater began with a letter from Peter Morgan, the acclaimed British screenwriter (The Queen), announcing his intention to write a play about the Frost-Nixon interviews. Since I loved the theater (and have written plays myself), I was happy to help in what seemed then a precious little enterprise. At lunches in London and Washington, I spilled out my memories. And then I remembered that I had written a narrative of my involvement with Frost and Nixon, highlighting various tensions in the Frost camp and criticizing the interviewer for failing, until the end, to apply himself to his historic duty. Out of deference to Frost, I hadn't published it. My manuscript had lain forgotten in my files for 30 years. With scarcely a glance at it, I fished it out and sent it to Morgan. In the succeeding months I answered his occasional inquiry without giving the matter much thought. I sent Morgan transcripts of the conversations between Nixon and Colson that I had uncovered for Frost. About a year after first hearing from Morgan, I learned that the play was finished and would première at the 250-seat Donmar Warehouse Theatre in London with Frank Langella in the role of Nixon. Morgan asked if I would be willing to come over for a couple of days to talk to Langella and the other actors. I said I'd love to. On the flight to London I reread my 1977 manuscript and I read the play, which had been fashioned as a bout between fading heavyweights, each of whose careers were on the wane, each trying to use the other for resurrection. The concept was theatrically brilliant, I thought, as well as entirely accurate. 
A major strand was the rising frustration of a character called Jim Reston at the slackness of a globe-trotting gadfly called David Frost. Into this Reston character was poured all the anger of the American people over Watergate; it was he who would prod the Frost character to be unrelenting in seeking the conviction of Richard Nixon. The play was a slick piece of work, full of laughs and clever touches. For the play's first reading we sat round a simple table at the Old Vic, ten actors (including three Americans), Morgan, me and the director, Michael Grandage. "Now we're going to go around the table, and everyone is going to tell me, 'What was Watergate?'" Grandage began. A look of terror crossed the actors' faces, and it fell to me to explain what Watergate was and why it mattered. The play, in two acts, was full of marvelous moments. Nixon had been humanized just enough, a delicate balance. To my amusement, Jim Reston was played by a handsome 6-foot-2 triathlete and Shakespearean actor named Elliot Cowan. The play's climax—the breaking of Nixon—had been reduced to about seven minutes and used only a few sentences from my Colson material. When the reading was over, Morgan turned to Grandage. "We can't do this in two acts," he said. The emotional capital built up in Act I would be squandered when theatergoers repaired to the lobby for refreshments and cellphone calls at intermission. Grandage agreed. I knew not to argue with the playwright in front of the actors. But when Morgan and I retreated to a restaurant for lunch, I insisted that the breaking of Nixon happened too quickly. There was no grinding down; his admission was not "earned." I pleaded for the inquisition to be protracted, lengthened, with more of the devastating Colson material put back in. Morgan resisted. This was theater, not history. He was the dramatist; he knew what he was doing. He was focused on cutting, not adding, lines. Back at the theater, after a second reading, Langella took up my argument on his own. Nixon's quick collapse did not feel "emotionally right" to him, he said. He needed more lines. He needed to suffer more. Grandage listened for a while, but the actor's job was not to question the text, but to make the playwright's words work. The play would stay as written. It opened in London on August 10, 2006, to terrific reviews. The critics raved about Langella's performance as Nixon, as well as Michael Sheen's as David Frost. (I tried not to take it personally when the International Herald Tribune critic, Matt Wolf, wrote, "Frost/Nixon provide[s] a snarky guide to [the] proceedings in the form of Elliot Cowan's bespectacled James Reston, Jr.") No one seemed to care about what was historically accurate and what had been made up. No one seemed to find Nixon's breaking down and subsequent contrition unsatisfying. Not even me. Langella had made it work, brilliantly...not through more words, but with shifting eyes, awkward pauses and strange, uncomfortable body language, suggesting a squirming, guilty man. Less had become more as a great actor was forced back on the essential tools of his art. Langella had not impersonated Nixon, but had become a totally original character, inspired by Nixon perhaps, but different from him. Accuracy—at least within the walls of theater—did not seem to matter. Langella's performance evoked, in Aristotelian terms, both pity and fear. No uncertainty lingered about the hero's (or the audience's) epiphany. In April 2007 the play moved to Broadway. Again the critics raved. 
But deep in his admiring review, the New York Times’ Ben Brantley noted, “Mr. Morgan has blithely rejiggered and rearranged facts and chronology” and referred readers to my 1977 manuscript, which had just been published, at last, as The Conviction of Richard Nixon. A few days later, I heard from Morgan. Brantley’s emphasis on the play’s factual alterations was not helpful, he said. Morgan and I had long disagreed on this issue of artistic license. I regarded it as a legitimate difference between two people coming from different value systems. Beyond their historical worth, the 1977 Nixon interviews had been searing psychodrama, made all the more so by the uncertainty over their outcome—and the ambiguity that lingered. I did not think they needed much improving. If they were to be compressed, I thought they should reflect an accurate essence. Morgan’s attention was on capturing and keeping his audience. Every line needed to connect to the next, with no lulls or droops in deference to dilatory historical detail. Rearranging facts or lines or chronology was, in his view, well within the playwright’s mandate. In his research for the play, different participants had given different, Rashômon-like versions of the same event. “Having met most of the participants and interviewed them at length,” Morgan wrote in the London program for the play, “I’m satisfied no one will ever agree on a single, ‘true’ version of what happened in the Frost/Nixon interviews—thirty years on we are left with many truths or many fictions depending on your point of view. As an author, perhaps inevitably that appeals to me, to think of history as a creation, or several creations, and in the spirit of it all I have, on occasion, been unable to resist using my imagination.” In a New York Times article published this past November, Morgan was unabashed about distorting facts. “Whose facts?” he told the Times reporter. Hearing different versions of the same events, he said, had taught him “what a complete farce history is.” I emphatically disagreed. No legitimate historian can accept history as a creation in which fact and fiction are equals. Years later, participants in historical events may not agree on “a single, ‘true’ version of what happened,” but it’s the historian’s responsibility to sort out who is telling the truth and who is covering up or merely forgetful. As far as I was concerned, there was one true account of the Frost/Nixon interviews—my own. The dramatist’s role is different, I concede, but in historical plays, the author is on the firmest ground when he does not change known facts but goes beyond them to speculate on the emotional makeup of the historical players. But this was not my play. I was merely a resource; my role was narrow and peripheral. Frost/Nixon—both the play and the movie—transcends history. Perhaps it is not even history at all: in Hollywood, the prevailing view is that a “history lesson” is the kiss of commercial death. In reaching for an international audience, one that includes millions unversed in recent American history, Morgan and Ron Howard, the film’s director, make the history virtually irrelevant. In the end it is not about Nixon or Watergate at all. It’s about human behavior, and it rises upon such transcendent themes as guilt and innocence, resistance and enlightenment, confession and redemption. These are themes that straight history can rarely crystallize. In the presence of the playwright’s achievement, the historian—or a participant—can only stand in the wings and applaud.
James Reston Jr. is the author of The Conviction of Richard Nixon and 12 other books.
f5288d9692b8b8703fac26945daab959
https://www.smithsonianmag.com/history/garry-kasparovs-next-move-180949729/
What is Garry Kasparov’s Next Move?
What is Garry Kasparov’s Next Move? A vast global game of geopolitical chess seemed to be hanging in the balance the morning I met with Garry Kasparov, the Russian chess genius whom many regard as the greatest player of all time. What’s less well known about him is that for the past decade Kasparov has become a major player in that great game of liberty versus tyranny in which the globe is the board. He was jailed and, as recently as 2012, beaten in Moscow for protesting Vladimir Putin’s regime and its crackdown on civil liberties, and he’s been driven out of his homeland. After daring a presidential election challenge to Putin in 2007, one that was disqualified under murky circumstances, and a number of what he calls “accidents,” he no longer feels life and liberty are safe there. Not that his life is necessarily safer anywhere else in the world, as the fate of Russian dissident Alexander Litvinenko—who was poisoned with polonium-laced tea in a posh London hotel in 2006—attests. No tea was served in the mazelike reception lounge area of the large Upper West Side apartment complex where we met. Kasparov, 50, came barreling out of the elevator, a compact fellow with the physique and the no-nonsense mien of a welterweight boxer. He had just returned from the World Chess Championship in India, where his former protégé Magnus Carlsen, a then 22-year-old Norwegian prodigy, stunned the world with a smashing victory over the reigning champion, Viswanathan Anand. Kasparov, who became the 13th world champion in 1985 and was ranked number one in the world until he retired in 2005, seems genuinely in awe of Carlsen’s ability: “He has unique chess talents,” says Kasparov, who trained Carlsen for a year back in 2009. “I would say that he is a combination of Bobby Fischer and Anatoly Karpov [the Russian world champion whom Kasparov dethroned]. Because he has Karpov’s precision and ability to just locate the piece’s best positions but also Fischer’s determination. So he can play to the last point, the last moment, last chance, and some people say he’s good at squeezing water out of stone.” Meaning he can see possibilities of victory even in often bleak-looking end-game boards, possibilities that can be obtained only by exploiting minute, nearly invisible positional advantages. In fact, Kasparov believes the Norwegian has so far outdistanced the rest of the world that he will not be beaten by anyone “for next five years, at least,” although Kasparov thinks an American he had been bringing along, Hikaru Nakamura, may have a chance. Invisible positional advantages are what Kasparov must hope for in the global human rights game he’s playing now. His chief opponent, Putin, has a nuclear arsenal and a much-feared army of intelligence operatives, the FSB, as the successor organization to the KGB is called today. Kasparov’s “invisible” arsenal is moral force, which sometimes—as the recent celebration of Nelson Mandela reminds us—can triumph after years of struggle. But the odds are heavily stacked against him. Kasparov speaks with a kind of Russian stoicism about his entry into politics: “I was not playing to win, it was just something I believed was important for me as a human being.
So it’s like a moral imperative rather than coldblooded calculation.” Kasparov is now chairman of the international council of the Human Rights Foundation, an organization identified with Vaclav Havel, one of modern history’s greatest dissidents, whose Velvet Revolution in Czechoslovakia was a landmark in the beginning of the end of the Soviet empire—but not the end of repression in Russia. After a coterie of Harvard-based economic advisers helped engineer the privatizing of Russian state assets in the 1990s to the profit of corrupt oligarchs, the consequent immiseration of the Russian people led to Putin’s rise to power. And that led to Putin’s ongoing attempt to recoup what had been lost—seeking to recapture the states that had separated themselves from the Soviet empire, and to crush democracy within Russia. This very morning it looked as if Putin had pulled off another bold move, what might be called in chess terms, “Putin’s Gambit,” his attempt to recapture Ukraine, the lost queen of the new Russian empire, from the seductive embrace of the West. I show Kasparov the morning’s dramatic Wall Street Journal Page 1 headline: “Ukraine’s Pivot to Moscow Leaves West out in the Cold.” The gist: When it looked as if Ukraine President Viktor Yanukovych was about to sign long-negotiated economic agreements that would bring Ukraine closer to membership in the European Union, he was reportedly summoned for a chat with Putin and, not long afterward, announced that he had decided not to sign the agreements. It was widely reported that Putin had used a combination of threats, bribes and economic enticements to lure Ukraine back. As Kasparov and I spoke in New York, halfway across the globe in Kiev, Ukraine’s capital, tens of thousands were converging to protest what they regarded as their being sold back into neo-Soviet satellite status, toppling the statue of Lenin in Kiev’s main square. As I write, there are despairing reports of heavily armed police storming into opposition TV and radio stations. By press time, the violence was intensifying and spreading throughout Ukraine, no endgame in sight. Foreign policy commentators were speaking of this as a decisive moment in post-cold war history. And Garry Kasparov, I came to realize, as he analyzed the news, was viewing the episode in the perspective of the history not just of the past two decades, but of the past century. He sees the contemporary situation as a badly played chess match in which the West lost its chance to press its advantage after the fall of the Soviet Union, instead complacently settling for what looked like a draw—one that now might turn into a decisive loss. What impressed me about Kasparov was how well read and sophisticated in his knowledge of history and international politics he was. Chess genius does not always translate to real-world intelligence (Bobby Fischer ended up as a paranoid Holocaust denier). And Kasparov deplores the tragic depiction of a Russian prodigy in Nabokov’s chess novel, The Defense. He’s deeply learned in history and historical parallels. When the talk turns to the Sochi Olympics, he refers back to the German games of 1936: “The Olympics started four months after Germany [remilitarized the Rhineland], violating Versailles agreement, and within one month after the beginning of the civil war in Spain. Soon German planes were bombing Spanish cities—the Western powers pretended it was business as usual.” “You think the Sochi Olympics is...?”
“The Sochi Olympics, I think, might be a total disaster, [but] we’re lucky. Because [the difference between] Hitler and Putin is that Putin doesn’t have a proper organization behind him in Russia.” Kasparov’s animus toward Putin led me to ask the philosophical question “Do you believe in evil?” “Everyone has an evil component within,” he tells me. “It’s matter of circumstance whether it emerges. Whether he becomes ‘the right man in the right place at the right time’ for evil to emerge. Stalin had it, all the components in place.” Ron Rosenbaum is the author of seven books of nonfiction, including The Shakespeare Wars: Clashing Scholars, Public Fiascoes, Palace Coups, and How the End Begins: The Road to a Nuclear World War III. An updated edition of his book, Explaining Hitler: The Search for the Origins of His Evil is being published by DaCapo/Perseus Books.
7b62fef5911851450f2ae79a2f4c76e3
https://www.smithsonianmag.com/history/george-friedman-on-world-war-iii-776748/
George Friedman on World War III
George Friedman on World War III George Friedman holds a doctorate in government from Cornell University and is the founder and chief executive of Stratfor, a geopolitical consulting firm in Austin, Texas. His most recent book is The Next 100 Years: A Forecast for the 21st Century. He spoke with Terence Monmaney. Commentators have declared the end of American dominance. You disagree. Why? The 20th century wasn’t the American century. In the first half of the century, the United States was a peripheral player—marginal to what was happening. From 1945 to 1991, the United States was caught in a terrific conflict with the Soviet Union. The United States has been the sole global power [only] since 1991, less than 20 years. People say China is emerging as a power. The U.S. economy is roughly three times larger than China’s. That’s a $10 trillion difference. Twenty-five percent of the world’s economic activity happens in the United States. The U.S. Navy controls all the oceans. We are an order of magnitude more powerful than anyone else. Undermining that kind of power can happen, but it normally takes wars, and it certainly takes generations. You posit a third world war starting in 2050. My expectation is we’re going to see a fragmentation in China because of internal social stresses, and the weakening of Russia. Three powers are emerging on the periphery of Eurasia. One is Japan, which is truly the center of gravity of Asia; it’s the second-largest economy in the world. Unlike China, Japan does not have a billion people living in sub-Saharan-type poverty. It is unified. It has the largest navy in Asia. Second is Turkey, now the 17th-largest economy in the world and the largest Islamic economy. And whenever Islam emerges into a coherent political entity, which it hasn’t done for a century, Turkey is almost invariably at its center. Turkey has by far the most powerful and effective military in Europe and is going to be a major Mediterranean power. The third country is Poland. Few people know that Poland is the 21st-largest economy in the world, the 8th-largest in Europe, and by far the most dynamic. It is also a country very much afraid of Germany and Russia. Russia is right now in the process of rebuilding itself. This makes the Poles very uneasy. The Germans are reaching out to the Russians. Poland feels trapped. Japan is utterly dependent on the sea lanes for the import and export of products. And those sea lanes are controlled by the United States. The United States controls the oceans, and its view is that that is the foundation of its national security. As Japan and Turkey become greater maritime powers, the United States will become hostile toward them. Japan and Turkey each wants to be a maritime power and each sees the U.S. as a threat. Poland has no interest in being a maritime power. It’s afraid of Turkey, and interested in the U.S. There’s a natural coalition. The center of gravity of American military power is in space. Everything from navigation to communication to intelligence satellites operates in space. If any power were to knock out the United States, it would have to knock out those assets. If the Japanese and Turks were to take on the United States, that would be the place they would have to strike first, to blind us, to cripple us. I would expect the war to start there. It seems like science fiction, but one wonders how somebody in 1900 would have felt about a description of what World War II was going to be like.
The details may not be as I say—there may be other players, it may not happen in 2050—but every century has a war. The 21st century is not going to be the first century without major warfare. Terence Monmaney is the deputy editor of Smithsonian magazine.
ba617013b55daa323c4a078709505117
https://www.smithsonianmag.com/history/george-jetson-gets-a-check-up-137749/
George Jetson Gets A Check-Up
George Jetson Gets A Check-Up This is the 14th in a 24-part series looking at every episode of “The Jetsons” TV show from the original 1962-63 season. The 14th episode of “The Jetsons” originally aired in the U.S. on December 30, 1962, and was titled “Test Pilot.” This episode (like so many others) centers around the competition between Spacely Sprockets and Cogswell Cogs. Both companies have developed an invincibility suit which can supposedly withstand anything from gigantic sawblades to missiles being fired directly at it. The only trouble is that neither Mr. Spacely nor Mr. Cogswell can find any person brave enough (or dumb enough) to act as a human guinea pig and test the suit’s ability to keep its wearer safe. George goes to the doctor for an insurance physical and gets some bad news. George swallows a Peek-A-Boo Prober Capsule which travels around the inside of his body showing the doctor (in a rather humorous way, of course) how George’s various organs are holding up. “You just swallow it and it transmits pictures to a TV screen,” the doctor explains. Through a series of mix-ups the doctor diagnoses George as having very little time to live. George then takes “live each day as if it were your last” quite literally and begins making hasty decisions — giving his family money to spend frivolously and telling off his boss, Mr. Spacely. Mr. Spacely realizes that George’s newfound bravery may be just what he needs to test out the invincibility suit. Mr. Cogswell tries to poach the newly heroic Jetson for his company since he’s had no more luck than Mr. Spacely in finding a test pilot. Mr. Spacely wins out and George goes on testing the suit without a care in the world, acting rather calm for a man who believes that he’ll soon be six feet under. (Or six feet over? I don’t think “The Jetsons” ever addresses if people of the 21st century are buried or cremated or shot into space or something.) After many death-defying tests, George discovers that the diagnosis was wrong and that he’s not going to die. George then reverts back to the lovable coward he always was and does his best to get out of the last test, which just so happens to involve two missiles being shot at him. In the end, it wasn’t the missiles or the sawblades that destroyed the suit, but the washing machine — and George remarks that they should have included a “dry-clean only” tag. The 1950s was an exciting decade for medicine with many important innovations — from Salk’s polio vaccine to the first organ transplant. These incredible advancements led many to believe that such marvelous medical discoveries would continue at an even more accelerated rate into the 21st century, including in how to diagnose different diseases. As Dr. Kunio Doi explains in his 2006 paper “Diagnostic Imaging Over the Last 50 Years,” the science of seeing inside the human body has developed tremendously since the 1950s. The biggest hurdle in diagnostic imaging at mid-century was the manual processing of film, which could be time-consuming: most diagnostic images were obtained by use of screen-film systems and a high-voltage x-ray generator for conventional projection x-ray imaging. Most radiographs were obtained by manual processing of films in darkrooms, but some of the major hospitals began to use automated film processors.
The first automated film processor was a large mechanical system with film hangers, which was designed to replace the manual operation of film development; it was very bulky, requiring a large space, and took about 40 min to process a film. The January 17, 1960, edition of the Sunday comic strip Our New Age by Athelstan Spilhaus offered an optimistic look at the medical diagnostic instruments of the future. The strip explains that one day patients might step into an “examination booth” while outfitted with a suit that measures all kinds of things at once — your heart rate, blood pressure, breathing and so on. This suit will, of course, be connected to a computer which will spit out data to be analyzed by a doctor. The prescription will then be “automatically” printed out for the patient. Just as we see with George Jetson, “automatic” diagnosis in this comic strip from 1960 doesn’t mean that humans will be taken completely out of the picture. Doctors of the future, we were told, will still play a vital role in analyzing information and double-checking the computer’s diagnosis. As Dr. Doi notes in his paper, we’ve made tremendous strides in diagnosis over the last 50 years. But I suppose we’re still waiting on that invincibility suit. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
4f7da1a645e6a640303b50fd85f80336
https://www.smithsonianmag.com/history/george-washington-and-his-maps-72194830/
George Washington and His Maps
George Washington and His Maps First in war. First in peace. First to look at a map whenever he had a question about waging the former and sustaining the latter. It’s not how we typically picture George Washington: bent over a map by candlelight, scrutinizing, measuring and in some cases actually drawing the topographical details that would help conquer a wilderness, win a war, create a republic. But as historian Barnet Schecter shows us in his illustrated new history, George Washington’s America: A Biography Through His Maps, many of our first president’s decisions during his long career as a surveyor, soldier and statesman were made only after careful readings of the existing cartographical materials. About 43 of Washington’s maps—the actual maps—were saved and bound together, most likely by his family after his death in 1799. Eventually, they made their way to Yale University’s Sterling Library. Schecter, a 1985 Yale graduate, read about them in the university’s alumni magazine. Intrigued, he went to New Haven to see them and was flabbergasted by their richness—exquisitely rendered, copper-plate engraved, many with additional watercolor painting. All were from Washington’s personal library and (in a stroke of good timing for Schecter) recently cleaned and restored. “I was blown away,” says Schecter, the author of the critically acclaimed books The Revolutionary Battle of New York and the Civil War Draft Riots. “To hold maps he held sends shivers down the spine.” “The Yale atlas enables us to look over Washington’s shoulder,” Schecter writes in the introduction to his book, “accompanying him as he journeyed through these landscapes, or struggled to direct his generals and monitor their campaigns in distant theaters of battle.” Schecter’s book examines 190 of the founding father’s maps, including the original 43 maps in the Atlas as well as others that appeared in a separate inventory of Washington’s library. Here are 10 maps Schecter feels are most important in understanding the role that maps played in the life of Washington in each phase of his remarkable career. Part of the significance of this map, originally done in 1751, was its creators: Peter Jefferson, Thomas’ father, and Joshua Fry, who commanded George Washington during the French and Indian War. But Schecter suggests that it also maps out the contours of the young Washington’s mind and character. “All the land up to the mountains was owned by people like Lord Fairfax,” Schecter says. “This map sets up one of the great shaping forces of Washington’s life—his search for land beyond the mountains. It shows the acquisitive, ambitious side of the man.” Later, he notes, “That self-interested preoccupation became ‘how do we unify this country?’” Washington found the answer to both of those questions in his maps.
5f2fa5a6ae6545a1ce711a6c2c2fd955
https://www.smithsonianmag.com/history/george-washington-the-reluctant-president-49492/?c=y&page=3
George Washington: The Reluctant President
George Washington: The Reluctant President Editor’s note: Even as the Constitution was being ratified, Americans looked toward a figure of singular probity to fill the new office of the presidency. On February 4, 1789, the 69 members of the Electoral College made George Washington the only chief executive to be unanimously elected. Congress was supposed to make the choice official that March but could not muster a quorum until April. The reason—bad roads—suggests the condition of the country Washington would lead. In a new biography, Washington: A Life, Ron Chernow has created a portrait of the man as his contemporaries saw him. The excerpt below sheds light on the president’s state of mind as the first Inauguration Day approached. The Congressional delay in certifying George Washington’s election as president only allowed more time for doubts to fester as he considered the herculean task ahead. He savored his wait as a welcome “reprieve,” he told his former comrade in arms and future Secretary of War Henry Knox, adding that his “movements to the chair of government will be accompanied with feelings not unlike those of a culprit who is going to the place of his execution.” His “peaceful abode” at Mount Vernon, his fears that he lacked the requisite skills for the presidency, the “ocean of difficulties” facing the country—all gave him pause on the eve of his momentous trip to New York. In a letter to his friend Edward Rutledge, he made it seem as if the presidency was little short of a death sentence and that, in accepting it, he had given up “all expectations of private happiness in this world.” The day after Congress counted the electoral votes, declaring Washington the first president, it dispatched Charles Thomson, the secretary of Congress, to bear the official announcement to Mount Vernon. The legislators had chosen a fine emissary. A well-rounded man, known for his work in astronomy and mathematics, the Irish-born Thomson was a tall, austere figure with a narrow face and keenly penetrating eyes. He couldn’t have relished the trying journey to Virginia, which was “much impeded by tempestuous weather, bad roads, and the many large rivers I had to cross.” Yet he rejoiced that the new president would be Washington, whom he venerated as someone singled out by Providence to be “the savior and father” of the country. Having known Thomson since the Continental Congress, Washington esteemed him as a faithful public servant and exemplary patriot. Around noon on April 14, 1789, Washington flung open the door at Mount Vernon and greeted his visitor with a cordial embrace. Once in the privacy of the mansion, he and Thomson conducted a stiff verbal minuet, each man reading from a prepared statement. Thomson began by declaring, “I am honored with the commands of the Senate to wait upon your Excellency with the information of your being elected to the office of President of the United States of America” by a unanimous vote. He read aloud a letter from Senator John Langdon of New Hampshire, the president pro tempore. “Suffer me, sir, to indulge the hope that so auspicious a mark of public confidence will meet your approbation and be considered as a sure pledge of the affection and support you are to expect from a free and enlightened people.” There was something deferential, even slightly servile, in Langdon’s tone, as if he feared that Washington might renege on his promise and refuse to take the job. Thus was greatness once again thrust upon George Washington. 
Any student of Washington’s life might have predicted that he would acknowledge his election in a short, self-effacing speech full of disclaimers. “While I realize the arduous nature of the task which is conferred on me and feel my inability to perform it,” he replied to Thomson, “I wish there may not be reason for regretting the choice. All I can promise is only that which can be accomplished by an honest zeal.” This sentiment of modesty jibed so perfectly with Washington’s private letters that it could not have been feigned: he wondered whether he was fit for the post, so unlike anything he had ever done. The hopes for republican government, he knew, rested in his hands. As commander in chief, he had been able to wrap himself in a self-protective silence, but the presidency would leave him with no place to hide and expose him to public censure as nothing before. Because the vote counting had been long delayed, Washington, 57, felt the crush of upcoming public business and decided to set out promptly for New York on April 16, accompanied in his elegant carriage by Thomson and aide David Humphreys. His diary entry conveys a sense of foreboding: “About ten o’clock, I bade adieu to Mount Vernon, to private life, and to domestic felicity and, with a mind oppressed with more anxious and painful sensations than I have words to express, set out for New York...with the best dispositions to render service to my country in obedience to its call, but with less hope of answering its expectations.” Waving goodbye was Martha Washington, who wouldn’t join him until mid-May. She watched her husband of 30 years depart with a mixture of bittersweet sensations, wondering “when or whether he will ever come home again.” She had long doubted the wisdom of this final act in his public life. “I think it was much too late for him to go into public life again,” she told her nephew, “but it was not to be avoided. Our family will be deranged as I must soon follow him.” Determined to travel rapidly, Washington and his entourage set out each day at sunrise and put in a full day on the road. Along the way he hoped to keep ceremonial distractions to a minimum, but he was soon disabused: eight exhausting days of festivities lay ahead. He had only traveled ten miles north to Alexandria when the townspeople waylaid him with a dinner, lengthened by the mandatory 13 toasts. Adept at farewells, Washington was succinctly eloquent in response. “Unutterable sensations must then be left to more expressive silence, while, from an aching heart, I bid you all, my affectionate friends and kind neighbors, farewell.” Before long, it was apparent that Washington’s journey would form the republican equivalent of the procession to a royal coronation. As if already a seasoned politician, he left a trail of political promises in his wake. While in Wilmington, he addressed the Delaware Society for Promoting Domestic Manufacturers and imparted a hopeful message. “The promotion of domestic manufactures will, in my conception, be among the first consequences which may naturally be expected to flow from an energetic government.” Arriving in Philadelphia, he was met by local dignitaries and asked to mount a white horse for his entry into town. When he crossed a bridge over the Schuylkill, it was wreathed with laurels and evergreens, and a cherubic boy, aided by a mechanical device, lowered a laurel crown over his head. 
Recurrent cries of “Long Live George Washington” confirmed what his former aide James McHenry had already told him before he left Mount Vernon: “You are now a king under a different name.” As Washington entered Philadelphia, he found himself, willy-nilly, at the head of a full-scale parade, with 20,000 people lining the streets, their eyes fixed on him in wonder. “His Excellency rode in front of the procession, on horseback, politely bowing to the spectators who filled the doors and windows by which he passed,” reported the Federal Gazette, noting that church bells rang as Washington proceeded to his old haunt, the City Tavern. After the bare-knuckled fight over the Constitution, the newspaper editorialized, Washington had united the country. “What a pleasing reflection to every patriotic mind, thus to see our citizens again united in their reliance on this great man who is, a second time, called upon to be the savior of his country!” By the next morning, Washington had grown tired of the jubilation. When the light horse cavalry showed up to accompany him to Trenton, they discovered he had left the city an hour earlier “to avoid even the appearance of pomp or vain parade,” reported one newspaper. As Washington approached the bridge over Assunpink Creek in Trenton, the spot where he had stood off the British and Hessians, he saw that the townsfolk had erected a magnificent floral arch in his honor and emblazoned it with the words “December 26, 1776” and the proclamation “The Defender of the Mothers will also Defend the Daughters.” As he rode closer, 13 young girls, robed in spotless white, walked forward with flower-filled baskets, scattering petals at his feet. Astride his horse, tears standing in his eyes, he returned a deep bow as he noted the “astonishing contrast between his former and actual situation at the same spot.” With that, three rows of women—young girls, unmarried ladies and married ones—burst into a fervent ode on how he had saved fair virgins and matrons alike. The adulation only quickened Washington’s self-doubt. “I greatly apprehend that my countrymen will expect too much from me,” he wrote to Rutledge. “I fear, if the issue of public measures should not correspond with their sanguine expectations, they will turn the extravagant...praises which they are heaping upon me at this moment into equally extravagant...censures.” There was no way, it seemed, that he could dim expectations or escape public reverence. By now sated with adulation, Washington preserved a faint hope that he would be allowed to make an inconspicuous entry into New York. He had pleaded with Gov. George Clinton to spare him further hoopla: “I can assure you, with the utmost sincerity, that no reception can be so congenial to my feelings as a quiet entry devoid of ceremony.” But he was fooling himself if he imagined he might slip unobtrusively into the temporary capital. Never reconciled to the demands of his celebrity, Washington still fantasized that he could shuck that inescapable burden. When he arrived at Elizabethtown, New Jersey, on April 23, he beheld an impressive phalanx of three senators, five congressmen and three state officials awaiting him. He must have intuited, with a sinking sensation, that this welcome would eclipse even the frenzied receptions in Philadelphia and Trenton. Moored to the wharf was a special barge, glistening with fresh paint, constructed in his honor and equipped with an awning of red curtains in the rear to shelter him from the elements. 
To nobody’s surprise, the craft was steered by 13 oarsmen in spanking white uniforms. As the barge drifted into the Hudson River, Washington made out a Manhattan shoreline already “crowded with a vast concourse of citizens, waiting with exulting anxiety his arrival,” a local newspaper said. Many ships anchored in the harbor were garlanded with flags and banners for the occasion. If Washington gazed back at the receding Jersey shore, he would have seen that his craft led a huge flotilla of boats, including one bearing the portly figure of Gen. Henry Knox. Some boats carried musicians and female vocalists on deck, who serenaded Washington across the waters. “The voices of the ladies were...superior to the flutes that played with the stroke of the oars in Cleopatra’s silken-corded barge,” was the imaginative verdict of the New York Packet. These wafted melodies, united with repeated cannon roar and thunderous acclaim from crowds onshore, again oppressed Washington with their implicit message of high expectations. As he confided to his diary, the intermingled sounds “filled my mind with sensations as painful (considering the reverse of this scene, which may be the case after all my labors to do good) as they are pleasing.” So as to guard himself against later disappointment, he didn’t seem to allow himself the smallest iota of pleasure. When the presidential barge landed at the foot of Wall Street, Governor Clinton, Mayor James Duane, James Madison and other luminaries welcomed him to the city. The officer of a special military escort stepped forward briskly and told Washington that he awaited his orders. Washington again labored to cool the celebratory mood, which burst forth at every turn. “As to the present arrangement,” he replied, “I shall proceed as is directed. But after this is over, I hope you will give yourself no further trouble, as the affection of my fellow-citizens is all the guard I want.” Nobody seemed to take the hint seriously. The streets were solidly thronged with well-wishers and it took Washington a half-hour to arrive at his new residence at 3 Cherry Street, tucked away in the northeast corner of the city, a block from the East River, near the present-day Brooklyn Bridge. One week earlier, the building’s owner, Samuel Osgood, had agreed to allow Washington to use it as the temporary presidential residence. From the descriptions of Washington’s demeanor en route to the house, he finally surrendered to the general mood of high spirits, especially when he viewed the legions of adoring women. As New Jersey Representative Elias Boudinot told his wife, Washington “frequently bowed to the multitude and took off his hat to the ladies at the windows, who waved their handkerchiefs and threw flowers before him and shed tears of joy and congratulation. The whole city was one scene of triumphal rejoicing.” Though the Constitution said nothing about an inaugural address, Washington, in an innovative spirit, contemplated such a speech as early as January 1789 and asked a “gentleman under his roof”—David Humphreys—to draft one. Washington had always been economical with words, but the collaboration with Humphreys produced a wordy document, 73 pages long, which survives only in tantalizing snippets. In this curious speech, Washington spent a ridiculous amount of time defending his decision to become president, as if he stood accused of some heinous crime. He denied that he had accepted the presidency to enrich himself, even though nobody had accused him of greed. 
“In the first place, if I have formerly served the community without a wish for pecuniary compensation, it can hardly be suspected that I am at present influenced by avaricious schemes.” Addressing a topical concern, he disavowed any desire to found a dynasty, citing his childless state. Closer in tone to future inaugural speeches was Washington’s ringing faith in the American people. He devised a perfect formulation of popular sovereignty, writing that the Constitution had brought forth “a government of the people: that is to say, a government in which all power is derived from, and at stated periods reverts to, them—and that, in its operation...is purely a government of laws made and executed by the fair substitutes of the people alone.” This ponderous speech never saw the light of day. Washington sent a copy to James Madison, who wisely vetoed it on two counts: that it was much too long and that its lengthy legislative proposals would be interpreted as executive meddling with the legislature. Instead, Madison helped Washington draft a far more compact speech that avoided the tortured introspection of its predecessor. A whirlwind of energy, Madison would seem omnipresent in the early days of Washington’s administration. Not only did he help draft the inaugural address, he also wrote the official response by Congress and then Washington’s response to Congress, completing the circle. This established Madison, despite his role in the House, as a pre-eminent adviser and confidant to the new president. Oddly enough, he wasn’t troubled that his advisory relationship to Washington might be construed as violating the separation of powers. Washington knew that everything he did at the swearing-in would establish a tone for the future. “As the first of everything in our situation will serve to establish a precedent,” he reminded Madison, “it is devoutly wished on my part that these precedents may be fixed on true principles.” He would shape indelibly the institution of the presidency. Although he had earned his reputation in battle, he made a critical decision not to wear a uniform at the inauguration or beyond, banishing fears of a military coup. Instead, he would stand there aglitter with patriotic symbols. To spur American manufactures, he would wear a double-breasted brown suit, made from broadcloth woven at the Woolen Manufactory of Hartford, Connecticut. The suit had gilt buttons with an eagle insignia on them; to round out his outfit, he would wear white hosiery, silver shoe buckles and yellow gloves. Washington already sensed that Americans would emulate their presidents. “I hope it will not be a great while before it will be unfashionable for a gentleman to appear in any other dress,” he told his friend the Marquis de Lafayette, referring to his American attire. “Indeed, we have already been too long subject to British prejudices.” To burnish his image further on Inauguration Day, Washington would powder his hair and wear a dress sword on his hip, sheathed in a steel scabbard. The inauguration took place at the building at Wall and Nassau streets that had long served as New York’s City Hall. It came richly laden with historical associations, having hosted John Peter Zenger’s trial in 1735, the Stamp Act Congress of 1765 and the Confederation Congress from 1785 to 1788. Starting in September 1788, the French engineer Pierre-Charles L’Enfant had remodeled it into Federal Hall, a suitable home for Congress. 
L’Enfant introduced a covered arcade at street level and a balcony surmounted by a triangular pediment on the second story. As the people’s chamber, the House of Representatives was accessible to the public, situated in a high-ceilinged octagonal room on the ground floor, while the Senate met in a second-floor room on the Wall Street side, buffering it from popular pressure. From this room Washington would emerge onto the balcony to take the oath of office. In many ways, the first inauguration was a hasty, slapdash affair. As with all theatrical spectacles, rushed preparations and frantic work on the new building continued until a few days before the event. Nervous anticipation spread through the city as to whether the 200 workmen would complete the project on time. Only a few days before the inauguration, an eagle was hoisted onto the pediment, completing the building. The final effect was stately: a white building with a blue and white cupola topped by a weather vane. A little after noon on April 30, 1789, following a morning filled with clanging church bells and prayers, a contingent of troops on horseback, accompanied by carriages loaded with legislators, stopped at Washington’s Cherry Street residence. Escorted by David Humphreys and aide Tobias Lear, the president-elect stepped into his appointed carriage, which was trailed by foreign dignitaries and throngs of joyous citizens. The procession wound slowly through the narrow Manhattan streets, emerging 200 yards from Federal Hall. After alighting from his carriage, Washington strode through a double line of soldiers to the building and mounted to the Senate chamber, where members of Congress awaited him expectantly. As he entered, Washington bowed to both houses of the legislature—his invariable mark of respect—then occupied an imposing chair up front. A profound hush settled on the room. Vice President John Adams rose for an official greeting, then informed Washington that the epochal moment had arrived. “Sir, the Senate and House of Representatives are ready to attend you to take the oath required by the Constitution.” “I am ready to proceed,” Washington replied. As he stepped through the door onto the balcony, a spontaneous roar surged from the multitude tightly squeezed into Wall and Broad streets and covering every roof in sight. This open-air ceremony would confirm the sovereignty of the citizens gathered below. Washington’s demeanor was stately, modest and deeply affecting: he clapped one hand to his heart and bowed several times to the crowd. Surveying the serried ranks of people, one observer said they were jammed so closely together “that it seemed one might literally walk on the heads of the people.” Thanks to his simple dignity, integrity and unrivaled sacrifices for his country, Washington’s conquest of the people was complete. A member of the crowd, the Count de Moustier, the French minister, noted the solemn trust between Washington and the citizens who stood packed below him with uplifted faces. As he reported to his government, never had a “sovereign reigned more completely in the hearts of his subjects than did Washington in those of his fellow citizens...he has the soul, look and figure of a hero united in him.” One young woman in the crowd echoed this when she remarked, “I never saw a human being that looked so great and noble as he does.” Only Congressman Fisher Ames of Massachusetts noted that “time has made havoc” upon Washington’s face, which already looked haggard and careworn. 
The sole constitutional requirement for the swearing-in was that the president take the oath of office. That morning, a Congressional committee decided to add solemnity by having Washington place his hand on a Bible during the oath, leading to a frantic, last-minute scramble to locate one. A Masonic lodge came to the rescue by providing a thick Bible, bound in deep brown leather and set on a crimson velvet cushion. By the time Washington appeared on the portico, the Bible rested on a table draped in red. The crowd grew silent as New York Chancellor Robert R. Livingston administered the oath to Washington, who was visibly moved. As the president finished the oath, he bent forward, seized the Bible and brought it to his lips. Washington felt this moment from the bottom of his soul: one observer noted the “devout fervency” with which he “repeated the oath and the reverential manner in which he bowed down and kissed” the Bible. Legend has it that he added, “So help me God,” though this line was first reported 65 years later. Whether or not Washington actually said it, very few people would have heard him anyway, since his voice was soft and breathy. For the crowd below, the oath of office was enacted as a kind of dumb show. Livingston had to lift his voice and inform the crowd, “It is done.” He then intoned: “Long live George Washington, president of the United States.” The spectators responded with huzzahs and chants of “God bless our Washington! Long live our beloved president!” They celebrated in the only way they knew, as if greeting a new monarch with the customary cry of “Long live the king!” When the balcony ceremony was concluded, Washington returned to the Senate chamber to deliver his inaugural address. In an important piece of symbolism, Congress rose as he entered, then sat down after Washington bowed in response. In England, the House of Commons stood during the king’s speeches; the seated Congress immediately established a sturdy equality between the legislative and executive branches. As Washington began his speech, he seemed flustered and thrust his left hand in his pocket while turning the pages with a trembling right hand. His weak voice was barely audible in the room. Fisher Ames evoked him thus: “His aspect grave, almost to sadness; his modesty, actually shaking; his voice deep, a little tremulous, and so low as to call for close attention.” Those present attributed Washington’s low voice and fumbling hands to anxiety. “This great man was agitated and embarrassed more than ever he was by the leveled cannon or pointed musket,” said Pennsylvania Senator William Maclay in sniggering tones. “He trembled and several times could scarce make out to read, though it must be supposed he had often read it before.” Washington’s agitation might have arisen from an undiagnosed neurological disorder or might simply have been a bad case of nerves. The new president had long been famous for his physical grace, but the sole gesture he used for emphasis in his speech seemed clumsy—“a flourish with his right hand,” said Maclay, “which left rather an ungainly impression.” For the next few years, Maclay would be a close, unsparing observer of the new president’s nervous quirks and tics. In the first line of his inaugural address, Washington expressed anxiety about his fitness for the presidency, saying that “no event could have filled me with greater anxieties” than the news brought to him by Charles Thomson. 
He had grown despondent, he said candidly, as he considered his own “inferior endowments from nature” and his lack of practice in civil government. He drew comfort, however, from the fact that the “Almighty Being” had overseen America’s birth. “No people can be bound to acknowledge and adore the invisible hand, which conducts the affairs of men, more than the people of the United States.” Perhaps referring obliquely to the fact that he suddenly seemed older, he called Mount Vernon “a retreat which was rendered every day more necessary, as well as more dear to me, by the addition of habit to inclination and of frequent interruptions in my health to the gradual waste committed on it by time.” In the earlier inaugural address drafted with David Humphreys, Washington had included a disclaimer about his health, telling how he had “prematurely grown old in the service of my country.” Setting the pattern for future inaugural speeches, Washington didn’t delve into policy matters, but trumpeted the big themes that would govern his administration, the foremost being the triumph of national unity over “local prejudices or attachments” that might subvert the country or even tear it apart. National policy needed to be rooted in private morality, which relied on the “eternal rules of order and right” ordained by heaven itself. On the other hand, Washington refrained from endorsing any particular form of religion. Knowing how much was riding on this attempt at republican government, he said that “the sacred fire of liberty, and the destiny of the republican model of government, are justly considered as deeply, perhaps as finally staked, on the experiment entrusted to the hands of the American people.” After this speech, Washington led a broad procession of delegates up Broadway, along streets lined by armed militia, to an Episcopal prayer service at St. Paul’s Chapel, where he was given his own canopied pew. After these devotions ended, Washington had his first chance to relax until the evening festivities. That night Lower Manhattan was converted into a shimmering fairyland of lights. From the residences of Chancellor Livingston and General Knox, Washington observed the fireworks at Bowling Green, a pyrotechnic display that flashed lights in the sky for two hours. Washington’s image was displayed in transparencies hung in many windows, throwing glowing images into the night. This sort of celebration, ironically, would have been familiar to Washington from the days when new royal governors arrived in Williamsburg and were greeted by bonfires, fireworks and illuminations in every window. Excerpted from Washington: A Life. Copyright © Ron Chernow. With the permission of the publisher, The Penguin Press, a member of Penguin Group (USA) Inc.
a24fb8c00615ec168512b96269dba7ff
https://www.smithsonianmag.com/history/george-washington-was-great-dad-too-180975139/
The Father of the Nation, George Washington Was Also a Doting Dad to His Family
The Father of the Nation, George Washington Was Also a Doting Dad to His Family George Washington is often described as childless, which is true, but only in the strictly biological definition. When I started digging into his archives, I was surprised to see that in reality, he was raising children from his late 20s until the day he died. When Washington met Martha Custis, she was a wealthy widow with a young daughter and son, and when they married, he became the legal guardian to Patsy and Jacky Custis. Washington’s letters and ledgers indicate that he spent significant time and money (though he often reimbursed himself from the Custis estate) making sure the children were happy, healthy and well educated. His youth had been defined by relative struggle and deprivation, and he wanted them to have the very best of everything. Instead, Washington the father was often heartbroken or frustrated. Patsy was likely epileptic, and no doctor or tincture or hot spring he found cured her, while Jacky, who was set to inherit the majority of his late father’s vast estate, preferred gambling and horses to hard work. The Washingtons had buried both by the end of the Revolution, but they played an active role in his widow’s life, even after she remarried, and raised Nelly and Wash, his two youngest children, making them de facto “First Children.” Washington also played father to a rotating cast that included Jacky’s other children, Eliza and Martha, nieces and nephews and, for over a year, the Marquis de Lafayette’s son. All of them, in many ways, were his children. So why don’t we know more about Washington as a family man, and what became of the children he raised after his death? I knew the importance put on biological children was somewhat to blame, but it wasn’t until a few years ago, when I got to know historian Cassandra Good, author of Founding Friendships, that I learned it was about more than just blood ties. We became friends on Twitter, as so many historians do, emailing and talking on the phone and, most recently, spending the night down the hall from each other at Mount Vernon, Washington’s historic home. All the while, Cassie has tortured me by teasing out bits of her ambitious and unprecedented research project on the Washington-Custis family, each one more interesting than the last. A finished book is still a couple of years off, but I managed to convince her to give us a sneak peek for Father’s Day. In George Washington’s letters to his children, wards, and grandchildren, his feelings are palpable. He’s annoyed, he’s encouraging... The letters he writes to Eliza and Nelly [two of Jacky’s daughters] about courtship, where he really talks about what you should be looking for in a partner, would be surprising to people who picture him as this stern, grey-haired guy. He is emotional and can talk to them at that level. He basically says to Eliza, “I’m giving you the advice I would give to my own daughter because that’s how I see you.” And she refers to them “as those who nature made my parents.” Even though she has a mother and a stepfather, she sees George and Martha as her parents. Do you think he was different with boys and girls? It seemed to me like he was more demanding of the boys and more emotional with the girls. I think that’s absolutely true. More was expected of boys: they have certain responsibilities, they have to get an education. Jacky and Wash were not very good students and were not particularly focused. 
We can’t really speak for Patsy because she died so young, but his granddaughters are all quite articulate, well-educated, fairly fiery women who were pretty politically engaged. That has to come, in part, from George and Martha. I love that Washington’s courtship advice is more or less warning his granddaughters against his younger self, when he was trying to marry up, marry rich, marry quick. Do you think he was conscious of that, or that his advice was more a reflection of who they were as people? Eliza was impulsive. Nelly was known as sort of flighty. I think he was trying to get them to think more seriously about the kind of commitment they were making; the choice of who to marry at this point is the biggest decision a woman will make in her life because that’s going to determine pretty much the course of the rest of her life. And he knows that. He’s been lucky enough, too, while he married mostly for a position, to have ended up in a very loving marriage. He wants them to be careful. Who do you think was his favorite child? Well, Eliza always said that she was his favorite child, and I find that kind of funny because...it's not that he disliked her, but it seems clear to me that he had a special place in his heart for Nelly. There’s all sorts of accounts from Nelly, and from her friends, that she could make him laugh even when he was in a stern mood. They had an especially close bond. And who do you think he disliked? I don't think he disliked any of them, but I do think he got incredibly frustrated with Wash. I think he would have continued to be disappointed in the decisions Wash made into adulthood. It’s an interesting situation. In my book, I write that George couldn’t give them what they really needed, which was adversity, but he keeps trying! When he raises Wash and sees, oh, I’m in the same situation again, and I can use what I learned the first time around... Well, he was able to keep Wash from marrying somebody way too young, which he failed to do the first time around with Jacky. Who knows how serious that love interest was, but at least Wash waited quite a long time after that to get married. So, you know, he had one win! But I'm sure he was quite frustrated. But that is not unique to George Washington. If you look at the other men of the Revolutionary era, pretty much all of them, especially in the South, have sons that are just not living up to their ideals. None of them are as serious. None of them are committed to public service. A bunch of them are involved in gambling or drinking or just losing huge amounts of money. By those measures, Wash is not so bad! The founders, with the exception of John Adams, ended up with a lot of ne’er-do-well sons. Come to think of it, Washington was very fond of John Quincy Adams. He promoted him. He seemed to give him a lot of attention. I don’t want to say there’s jealousy... There’s an unpublished letter from 1822, where John [Adams] has told [his son John Quincy's wife] Louisa Catherine that he and George Washington were hoping John Quincy would marry Nelly. Washington never says anything about this, but given what you're saying about how he felt about John Quincy, it makes some sense that he would want a man like that to marry Nelly. Now, there’s never anything between them. In fact, Nelly, throughout her life, hates John and John Quincy. She loathes them out of proportion to any rational reason. Maybe it was in part because she had some inkling that they wanted to set her up. 
After Washington died, did the world consider the Custis grandchildren his heirs? The Custis grandchildren did everything possible to make sure that the rest of the country knew that they were Washington's heirs. Not in any technical or legal sense, because while he gives them a few things in his will, Mount Vernon goes to a nephew [one of his brothers’ sons] Bushrod Washington. The Custis kids had so much already from [Jacky’s] estate, so there's no reason that George Washington needs to give them much. But he does say, in his will, I've committed to treating them like my own children, and so I'm giving them some things, like Nelly gets land. But [the Custis kids] buy the rest at the estate sale after Washington’s death; they're the ones that have the goods to display. Also, the younger two [Nelly and Wash] are in this famous portrait called “The Family of Washington” by Edward Savage, which gets made into prints and is incredibly popular. So a lot of Americans just know who they are because they have this thing hanging in their house. They're celebrities in that sense, and they keep working at that as they get older, whether it's giving speeches or giving gifts to be reported in the newspaper, to remind people that they are the children of Washington. If it was the 19th century and I saw the Custis name somewhere, I’d say, oooh, those are George Washington's heirs! Yes, people knew who they were; they always refer to Wash as the adopted son of Washington, so they emphasize, okay, these people are not blood-related but we know that they are his children. And it was known that [Custis] was Martha's last name before she married George. People didn't know as much who the actual blood-related Washingtons were. They sort of knew who Bushrod was, but he was very careful not to pin his name to George. His obituary doesn't even mention he was George Washington's nephew, so he wants to have his own identity, and he also never had the kind of relationship with George that the Custis kids did. He was never living in the president's house; he's not in a family portrait with him. Bushrod probably wanted the obituary to focus on his own accomplishments, like serving as a justice on the Supreme Court, whereas the Custis kids... Do you think they emphasized their connection to Washington in order to protect his legacy or further their own position in America? It's a combination of those things. If you were to ask them, they would say it's important to protect his legacy, not just as a sort of abstract memory, but his political ideals. But I also think, whether they would have admitted it or not, it was about power for them. These are people who are already a part of the elite, but none of them have personal accomplishments or the kind of civil service that would really make them prominent. They would have just been ordinary cash-poor, land-rich Virginia planters with lots of enslaved labor, if it was not for their relationship to George Washington. And I think they knew that, and they wanted to use Washington as a way to keep them connected to the political scene. They had grown up being celebrities and being connected to political power, and they don't want to let that go. Is there an instance in which they use Washington’s name or his legacy in a way that you felt he would have really disliked? Or that seemed a little too opportunistic? [Chuckles.] I think there are a lot of examples of that! 
For instance, Martha Custis Peter sends George Washington's gorget [a symbolic remnant of armor worn around the throat]—and this is the actual gorget that he wore as part of the British military in Virginia, before the Revolution—to this Federalist group in Boston at the height of the War of 1812. The Federalists are very against the war, to the point that they're starting to think of splitting off into another country. And [the Custises] never go [to Boston], but she's sending this and saying “I approve of your political ideals.” And then the newspapers say “We're so glad that the Washington family approves of what we're doing.” I don't know that George Washington would have exactly been thrilled with the hyper-partisan, against-the-national-government sentiment of some of these Boston Federalists. Look at what [the Custises] do with slavery. Washington does not actually do as much as he could have in terms of slavery, but he has this legacy where the anti-slavery people point to him in the 19th century and say, look, he freed the slaves. We have to remember he does that in part because he's not going to hurt anybody financially. If you look at most of the people in Virginia who actually free their slave labor, at their death, they don't have biological children who would lose money on this action. I think George Washington may have made a different calculation if Wash Custis didn’t already have a lot of slave labor from his father. He’s not hurting anybody in doing this. Certainly not his legacy. Whereas Wash goes full pro-slavery. In 1836, Wash gives a speech and says this is a white man's country. George Washington's actions may have sort of reflected that, but I don't think he would have said it. No, definitely not. Were there any disadvantages to being related to Washington for his heirs? As with the other founding fathers’ children, there are high expectations for this next generation. And in some ways, these high expectations are too much. Wash was a perpetual disappointment to some people, just as his father had been. People make fun of him all the time. One person calls him that “irascible little gentleman.” They compare him to George and, you know, most people are going to suffer in comparison. Since he's hitching his star to George all the time, it's pretty easy to say this guy is kind of ridiculous comparatively. He does paintings and puts on plays [about Washington] that are kind of mediocre. But for his sisters, I don't think there was much downside for them. There's always that guy who's going to say it. Oh, yeah, and even when Wash is going to make a speech at the dedication of the Mary Washington [Washington’s mother] memorial, Nelly writes to his wife and says, I hope he doesn't say anything that makes the newspapers make fun of them. If that was Washington, he would simply stop making those speeches. Wash has none of his grandfather’s restraint and gravity. He gives these over-the-top, passionate speeches—and they’re always about his relationship to George Washington. I get a lot of questions about Washington and slavery, and in particular, people ask me if Washington “had children out of wedlock like Jefferson.” The answer is technically no, because he was likely sterile, but given the “like Jefferson,” they were actually asking me if Washington had non-consensual relationships with enslaved women. We don’t know, but there’s been plenty to implicate Wash, right? The evidence we have right now is strongest for a woman named Maria Syphax. 
Genealogists and researchers are looking for this evidence, but she's born around 1803 or 1804, right around when Wash gets married. Syphax is later given around 17 acres of Arlington, his estate. There’s no legal deed, but Congress recognizes her claim to that land and gives it to her. So there’s recognition. And she says in a newspaper article in the 1880s, when she's an old woman, that Wash told her to her face that she was his daughter. There's also a family story that when she got married, they were married in the house. And Wash frees her and her children. He also frees close to a dozen other children. How many of those are his? Hard to know. There may be another line that comes from [enslaved worker] Caroline Branham, who would have been a fair amount older than Wash was, and was in the room when George Washington died at Mount Vernon. Her descendants are alive and around today and researching their connection. It seems fairly clear that African American descendants of Martha Washington [through her grandson Wash] are around today. Alexis Coe is the New York Times bestselling author of You Never Forget Your First: A Biography of George Washington.
791ae56b30fb2dd4cc13cbbe4ba28be4
https://www.smithsonianmag.com/history/german-pows-on-the-american-homefront-141009996/
German POWs on the American Homefront
German POWs on the American Homefront In the mid-1940s, when Mel Luetchens was a boy on his family’s Murdock, Nebraska, farm, where he still lives, he sometimes hung out with his father’s hired hands. “I looked forward to it,” he said. “They played games with us and brought us candy and gum.” The hearty young men who helped his father pick corn or put up hay or build livestock fences were German prisoners of war from a nearby camp. “They were the enemy, of course,” says Luetchens, now 70 and a retired Methodist minister. “But at that age, you don’t know enough to be afraid.” Since President Obama’s vow to close the Guantanamo Bay Detention Camp erupted into an entrenched debate about where to relocate the prisoners captured in the Afghanistan War, Luetchens has reflected on the “irony and parallel” of World War II POWs and Guantanamo inmates. Recently, the Senate overwhelmingly rejected providing funds to close the U.S. military prison in Cuba, saying that no community in America would want terrorism suspects in its backyard. But in America’s backyards and farm fields and even dining rooms is where many enemy prisoners landed nearly 70 years ago. As World War II raged, Allies such as Great Britain were running short of prison space to house POWs. From 1942 through 1945, more than 400,000 Axis prisoners were shipped to the United States and detained in camps in rural areas across the country. Some 500 POW facilities were built, mainly in the South and Southwest but also in the Great Plains and Midwest. At the same time that the prison camps were filling up, farms and factories across America were struggling with acute labor shortages. The United States faced a dilemma. According to Geneva Convention protocols, POWs could be forced to work only if they were paid, but authorities were afraid of mass escapes that would endanger the American people. Eventually, they relented and put tens of thousands of enemy prisoners to work, assigning them to canneries and mills, to farms to harvest wheat or pick asparagus, and just about any other place they were needed and could work with minimum security. About 12,000 POWs were held in camps in Nebraska. “They worked across the road from us, about 10 or 11 in 1943,” recalled Kelly Holthus, 76, of York, Nebraska. “They stacked hay. Worked in the sugar beet fields. Did any chores. There was such a shortage of labor.” “A lot of them were stone masons,” said Keith Buss, 78, who lives in Kansas and remembers four POWs arriving at his family’s farm in 1943. “They built us a concrete garage. No level, just nail and string to line the building up. It’s still up today.” Don Kerr, 86, delivered milk to a Kansas camp. “I talked to several of them,” he said. “I thought they were very nice.” “At first there was a certain amount of apprehension,” said Tom Buecker, the curator of the Fort Robinson Museum, a branch of the Nebraska Historical Society. “People thought of the POWs as Nazis. But half of the prisoners had no inclination to sympathize with the Nazi Party.” Fewer than 10 percent were hard-core ideologues, he added. Any such anxiety was short-lived at his house, if it existed at all, said Luetchens. His family was of German ancestry and his father spoke fluent German. “Having a chance to be shoulder-to-shoulder with [the prisoners], you got to know them,” Luetchens said. 
“They were people like us.” “I had the impression the prisoners were happy to be out of the war,” Holthus said, and Kerr recalled that one prisoner “told me he liked it here because no one was shooting at him.” Life in the camps was a vast improvement for many of the POWs who had grown up in “cold water flats” in Germany, according to former Fort Robinson, Nebraska, POW Hans Waecker, 88, who returned to the United States after the war and is now a retired physician in Georgetown, Maine. “Our treatment was excellent. Many POWs complained about being POWs—no girlfriends, no contact with family. But the food was excellent and clothing adequate.” Such diversions as sports, theater, chess games and books made life behind barbed wire a sort of “golden cage,” one prisoner remarked. Farmers who contracted for POW workers usually provided meals for them and paid the U.S. government 45 cents an hour per laborer, which helped offset the millions of dollars needed to care for the prisoners. Even though a POW netted only 80 cents a day for himself, it provided him with pocket money to spend in the canteen. Officers were not required to work under the Geneva Convention accords, which also prohibited POWs from working in dangerous conditions or in tasks directly related to the war effort. “There were a few cases when prisoners told other prisoners not to work so hard,” said historian Lowell May, author of Camp Concordia: German POWs in the Midwest. Punishment for such work slowdowns was usually several days of confinement with rations of only bread and water. “One prisoner at Camp Concordia said a good German would not help the Americans,” May said. “He was sent to a camp for Nazi supporters in Alva, Oklahoma.” Of the tens of thousands of POWs in the United States during World War II, only 2,222, less than 1 percent, tried to escape, and most were quickly rounded up. By 1946, all prisoners had been returned to their home countries. The deprivations of the postwar years in Europe were difficult for the repatriated men. The Luetchens, who established a “lively” letter exchange with their POW farmhands, sent them food and clothing. Eventually Luetchens and his parents visited some of them in Germany. Recently Luetchens considered those experiences in the context of current controversies about Guantanamo detainees. “It was less scary then,” he concluded, but he expressed hope for understanding others, even one’s designated enemies. “When you know people as human beings up close and understand about their lives, it really alters your view of people and the view of your own world.”
fac789ff78b8da451a9c79eeebe9dab8
https://www.smithsonianmag.com/history/ghosts-my-lai-180967497/
William Laws Calley Jr. was never really meant to be an officer in the U.S. Army. After getting low grades and dropping out of Palm Beach Junior College, he tried to enlist in 1964, but was rejected because of a hearing defect. Two years later, with the escalation in Vietnam, standards for enrollees changed and Calley—neither a valedictorian nor a troublemaker, just a fairly typical American young man trying to figure out what to do with his life—was called up. Before the decade was over Second Lieutenant Calley would become one of the most controversial figures in the country, if not the world. On March 16, 1968, during a roughly four-hour operation in the Vietnamese village of Son My, American soldiers killed approximately 504 civilians, including pregnant women and infants, gang-raped women and burned a village to ashes. Calley, though a low-ranking officer in Charlie Company, stood out because of the sheer number of civilians he was accused of killing and ordering killed. The red-haired Miami native known to friends as Rusty became the face of the massacre, which was named after one of the sub-hamlets where the killings took place, My Lai 4. His story dominated headlines, along with the Apollo 12 moon landing and the trial of Charles Manson. His case became a kind of litmus test for American values, a question not only of who was to blame for My Lai, but how America should conduct war and what constitutes a war crime. Out of the roughly 200 soldiers who were dropped into the village that day, 24 were later charged with criminal offenses, and only one was convicted, Calley. He was set free after serving less than four years. Since that time, Calley has almost entirely avoided the press. Now 74 years old, he declined to be interviewed for this story. But I was able to piece together a picture of his life and legacy by reviewing court records and interviewing his fellow soldiers and close friends. I traveled to Son My, where survivors are still waiting for him to come back and make amends. And I visited Columbus, Georgia, where Calley lived for nearly 30 years. I wanted to know whether Calley, a convicted mass murderer and one of the most notorious figures in 20th-century history, had ever expressed true contrition or lived a normal life. ********** The landscape surrounding Son My is still covered with rice paddies, as it was 50 years ago. There are still water buffalo fertilizing the fields and chickens roaming. Most of the roads are still dirt. On a recent Wednesday afternoon, ten young men were drinking beer and smoking cigarettes at the side of one of those roads. A karaoke machine was set up on a motorbike, and the loudspeakers were placed next to a blink-and-you-miss-it plaque with an arrow pointing to a “Mass Grave of 75 Victims.” Tran Nam was 6 years old when he heard gunshots from inside his mud and straw home in Son My. It was early morning and he was having breakfast with his extended family, 14 people in all. The U.S. Army had come to the village a couple of times previously during the war. Nam’s family thought it would be like before; they’d be gathered and interviewed and then let go. So the family kept on eating. “Then a U.S. soldier stepped in,” Nam told me. “And he aimed into our meal and shot. People collapsed one by one.” Nam saw the bullet-ridden bodies of his family falling—his grandfather, his parents, his older brother, his younger brother, his aunt and cousins. He ran into a dimly lit bedroom and hid under the bed. 
He heard more soldiers enter the house, and then more gunshots. He stayed under the bed as long as he could, but that wasn’t long because the Americans set the house on fire. When the heat grew unbearable, Nam ran out the door and hid in a ditch as his village burned. Of the 14 people at breakfast that morning, 13 were shot and 11 killed. Only Nam made it out physically unscathed. The six U.S. Army platoons that swept through Son My that day included 100 men from Charlie Company and 100 from Bravo Company. They killed some civilians straight off—shooting them point blank or tossing grenades into their homes. In the words of Varnado Simpson, a member of Second Platoon who was interviewed for the book Four Hours in My Lai, “I cut their throats, cut off their hands, cut out their tongue, their hair, scalped them. I did it. A lot of people were doing it, and I just followed. I lost all sense of direction.” Simpson went on to commit suicide. Soldiers gathered together villagers along a trail going through the village and also along an irrigation ditch to the east. Calley and 21-year-old Pvt. First Class Paul Meadlo mowed the people down with M-16s, burning through several clips in the process. The soldiers killed as many as 200 people in those two areas of Son My, including 79 children. Witnesses said Calley also shot a praying Buddhist monk and a young Vietnamese woman with her hands up. When he saw a 2-year-old boy who had crawled out of the ditch, Calley threw the child back in and shot him. Truong Thi Le, then a rice farmer, told me she was hiding in her home with her 6-year-old son and 17-year-old daughter when the Americans found them and dragged them out. When the soldiers fired an M-16 into their group, most died then and there. Le fell on top of her son and two bodies fell on top of her. Hours later, they emerged from the pile alive. “When I noticed that it was quiet, I pushed the dead bodies above me aside,” she told me. “Blood was all over my head, my clothes.” She dragged her son to the edge of a field and covered him with rice and cloth. “I told him not to cry or they would come to kill us.” When I asked about her daughter, Le, who had maintained her composure up till that point, covered her face with her hands and broke down in tears. She told me that Thu was killed along with 104 people at the trail but didn’t die right away. When it was safe to move, Le found Thu sitting and holding her grandmother, who was already dead. “Mom, I’m bleeding a lot,” Le remembers her daughter saying. “I have to leave you.” Nguyen Hong Man, 13 at the time of the massacre, told me he went into an underground tunnel with his 5-year-old niece to hide, only to watch her get shot right in front of him. “I lay there, horrified,” he said. “Blood from the nearby bodies splashed onto my body. People who were covered with a lot of blood and stayed still got the chance to survive, while kids did not. Many of them died as they cried for their parents in terror.” Initially, the U.S. Army portrayed the massacre as a great victory over Viet Cong forces, and that story might never have been challenged had it not been for a helicopter gunner named Ronald Ridenhour. He wasn’t there himself, but a few weeks after the operation, his friends from Charlie Company told him about the mass killing of civilians. He did some investigating on his own and then waited until he finished his service. 
Just over a year after the massacre, Ridenhour sent a letter to about two dozen members of Congress, the secretaries of state and defense, the secretary of the Army, and the chairman of the Joint Chiefs of Staff, telling them about a “2nd Lieutenant Kally” who had machine-gunned groups of unarmed civilians. Ridenhour’s letter spurred the inspector general of the Army, Gen. William Enemark, to launch a fact-finding mission, led by Col. William Wilson. At a hotel in Terre Haute, Indiana, Wilson spoke to Meadlo, the soldier who with Calley had gunned down the rows of villagers. Meadlo had been discharged from the Army because of a severe injury; like many others who’d been at Son My, he was essentially granted immunity when the investigation began. As he described what he’d done and witnessed, he looked at the ceiling and wept. “We just started wiping out the whole village,” he told Wilson. A subsequent inquiry by the Army’s Criminal Investigation Command discovered that military photographer Ronald Haeberle had taken photos during the operation. In a hotel room in Ohio, before a stunned investigator, Haeberle projected horrifying images of piled dead bodies and frightened Vietnamese villagers onto a hung-up bedsheet. Armed with Haeberle’s photos and 1,000 pages of testimony from 36 witnesses, the Army officially charged Calley with premeditated murder—just one day before he was scheduled to be discharged. Eighteen months later, in March 1971, a court-martial with a jury of six fellow officers, including five who had served in Vietnam, found Calley guilty of murdering at least 22 civilians and sentenced him to life in prison. The day the verdict came down, Calley defended his actions in a statement to the court: “My troops were getting massacred and mauled by an enemy I couldn’t see, I couldn’t feel and I couldn’t touch—that nobody in the military system ever described them as anything other than Communism. They didn’t give it a race, they didn’t give it a sex, they didn’t give it an age. They never let me believe it was just a philosophy in a man’s mind. That was my enemy out there.” ********** Despite the overwhelming evidence that Calley had personally killed numerous civilians, a survey found that nearly four out of five Americans disagreed with his guilty verdict. His name became a rallying cry on both the right and the left. Hawks said Calley had been simply doing his job. Doves said Calley had taken the fall for the generals and politicians who’d dragged America into a disastrous and immoral conflict. In newspaper articles around the world, one word became entwined with Calley’s name: scapegoat. Within three months of the verdict, the White House received more than 300,000 letters and telegrams, almost all in support of the convicted soldier. Calley himself received 10,000 letters and packages a day. His military defense counsel, Maj. Kenneth Raby, who spent 19 months working on the court-martial, told me Calley received so much mail that he had to be moved to a ground-floor apartment at Fort Benning, where the deliveries didn’t have to be carried up the stairs. Some of Calley’s supporters went to great lengths. Two musicians from Muscle Shoals, Alabama, released a recording called “The Battle Hymn of Lt. Calley,” which included the line, “There’s no other way to wage a war.” It sold more than a million copies. Digger O’Dell, a professional showman based in Columbus, Georgia, buried himself alive for 79 days in a used-car lot. 
Passersby could drop a coin into a tube that led down to O’Dell’s “grave,” with the proceeds going toward a fund for Calley. He later welded shut the doors of his car, refusing to come out until Calley was set free. Politicians, noting the anger of their constituents, made gestures of their own. Indiana Gov. Edgar Whitcomb ordered the state’s flags to fly at half-staff. Gov. John Bell Williams of Mississippi said his state was “about ready to secede from the Union” over the Calley verdict. Gov. Jimmy Carter, the future president, urged his fellow Georgians to “honor the flag as Rusty had done.” Local leaders across the country demanded that President Nixon pardon Calley. Nixon fell short of a pardon, but he ordered that Calley remain under house arrest in his apartment at Fort Benning, where he could play badminton in the backyard and hang out with his girlfriend. After a series of appeals, Calley’s sentence was cut from life to 20 years, then in half to ten years. He was set free in November 1974 after serving three and a half years, most of it at his apartment. In the months after his release, Calley made a few public appearances, and then moved a 20-minute drive down the road to Columbus, Georgia, where he disappeared into private life. ********** Situated along the Chattahoochee River, Columbus is first and foremost a military town. Its residents’ lives are linked to Fort Benning, which has served as the home of the U.S. Infantry School since 1918 and today supports more than 100,000 civilian and military personnel. “The Army is just a part of day-to-day life here,” the longtime Columbus journalist Richard Hyatt told me. “And back in the day, William Calley was part of that life.” Bob Poydasheff, the former mayor of Columbus, says there was controversy when Calley moved to town. “There were many of us who were just horrified,” he told me, raising his voice until he was almost shouting. “It’s just not done! You don’t go and kill unarmed civilians!” Still, Calley became a familiar face around Columbus. In 1976, he married Penny Vick, whose family owned a jewelry shop frequented by members of Columbus’ elite. One of their wedding guests was U.S. District Judge J. Robert Elliott, who had tried to get Calley’s conviction overturned two years earlier. After the wedding, Calley began working at the jewelry shop. He took classes to improve his knowledge of gemstones and got trained to make appraisals to increase the store’s business. In the 1980s, he applied for a real estate license and was initially denied because of his criminal record. He asked Reid Kennedy, the judge who had presided over his court-martial, if he’d write him a letter. He did so, and Calley got the license while continuing to work at the shop. “It’s funny isn’t it, that a man who breaks into your house and steals your TV will never get a license, but a man who’s convicted of killing 22 people can get one,” Kennedy told the Columbus Ledger-Enquirer in 1988. Al Fleming, a former local TV news anchor, described Calley as a soft-spoken man. When I met Fleming in Columbus over a steak dinner, one of the first things he told me was, “I’m not going to say anything bad about Rusty Calley....He and I were the best of friends for a long time. We still are, as far as I’m concerned.” (Calley left town some years back and now lives in Gainesville, Florida.) Fleming described how Calley used to sit with him at the restaurant he owned, Fleming’s Prime Time Grill, and talk late into the night about Vietnam. 
He told Fleming that Charlie Company had been sent to My Lai to “scorch the earth,” and that even years after his conviction, he still felt he’d done what he’d been ordered to do. After our dinner, Fleming gave me a tour in his tiny red Fiat, pausing to point out the house where Calley lived for nearly 30 years. He also pointed out an estate nearby that had appeared in The Green Berets, a pro-war 1968 film starring John Wayne. The Army had participated heavily in the production, providing uniforms, helicopters and other equipment. The battle scenes were filmed at Fort Benning, and a house in Columbus was used as a stand-in for a Viet Cong general’s villa. In the 1980s, the Green Beret house caught fire. When the neighbors rushed out to form a bucket brigade, Calley was right there with everyone else, trying to put out the flames. During his time in Columbus, Calley mostly succeeded in keeping himself out of the national spotlight. (Hyatt, the journalist, used to go to V.V. Vick Jewelers every few years, on the anniversary of the massacre, to try to get an interview with Calley, but was always politely denied.) Calley and Penny had one son, William Laws Calley III, known as Laws, who went on to get a PhD in electrical engineering at Georgia Tech. But divorce documents I found at the Muscogee County clerk’s office present a dismal picture. According to a legal brief filed by Calley’s attorney in 2008, he spent most of his adult years feeling powerless both at work and at home. It states that Calley did all the cooking, and all the cleaning that wasn’t done by the maid, and that he was their son’s primary caretaker. The jewelry store, according to the document, “was his life and, except for his son, was where he derived his self-worth....He even worked hard to try to infuse new ideas into the store to help it grow and be more profitable, all of which were rejected by Mrs. Calley.” In 2004, his wife, who inherited the store from her parents, stopped paying him a salary. He fell into a depression and moved to Atlanta to stay with Laws, living off his savings until it was gone. Calley and his son remain close. The divorce documents provided little information about Penny Vick’s side of the story apart from two ambiguous details. (Vick and Laws also declined to be interviewed for this story.) His lawyer disputed one assertion—that Calley “had been backing away from his marital relationship” prior to separation—but confirmed the other assertion—that Calley “consumed alcoholic beverages in his own area of the home on a daily basis.” In a strange twist, John Partin, the lawyer who represented Calley’s wife in the divorce, was a former Army captain who had served as an assistant prosecutor in Calley’s court-martial. “I’m proud of what we did,” Partin told me, referring to the nearly two years he spent trying to put Calley in prison. He and his co-counsel called about 100 witnesses to testify against Calley. When Nixon intervened to keep Calley out of jail, Partin wrote a letter to the White House saying that the special treatment accorded a convicted murderer had “defiled” and “degraded” the military justice system. By the time the divorce was settled, according to the court documents, Calley was suffering from prostate cancer and gastrointestinal problems. 
His lawyer described his earning capacity as “zero based upon his age and health.” He asked Penny for a lump-sum alimony payment of $200,000, half of their home equity, half of the individual retirement account in Penny’s name, two baker’s shelves and a cracked porcelain bird that apparently held emotional significance. ********** The closest Calley ever came to publicly apologizing for My Lai was at a 2009 meeting of the Kiwanis Club of Greater Columbus. Fleming set up the talk, on a Wednesday afternoon. No reporters were invited, but a retired local newsman surreptitiously blogged about it online and the local paper picked up the story. “There is not a day that goes by that I do not feel remorse for what happened that day in My Lai,” Calley told the 50 or so Kiwanis members. “I feel remorse for the Vietnamese who were killed, for their families, for the American soldiers involved and their families. I am very sorry.” The historian Howard Jones, author of My Lai: Vietnam, 1968, and the Descent into Darkness, read Calley’s words in news reports but didn’t believe they showed true contrition. “There just was no inner change of heart,” Jones told me. “I mean it just wasn’t there. No matter how people tried to paint it.” Jones especially took issue with the fact that Calley insisted in the Kiwanis speech that he’d only been following orders. It’s still unclear exactly what Capt. Ernest L. Medina told the men of Charlie Company the night before they were helicoptered into Son My. (He did not respond to interview requests for this story.) The captain reportedly told his soldiers that they were finally going to meet the Viet Cong’s 48th Local Force Battalion, a well-armed division of at least 250 soldiers, which for months had tormented them. Medina later claimed that he’d never told his men to kill innocent civilians. He testified at Calley’s court-martial that Calley had “hemmed and hawed” before admitting the extent of the slaughter. He said Calley told him, two days after the massacre, “I can still hear them screaming.” Medina himself was charged, tried and found innocent. Compelling, comprehensive, and haunting, based on both exhaustive archival research and extensive interviews, Howard Jones's My Lai will stand as the definitive book on one of the most devastating events in American military history. I wanted to get firsthand reports from other Charlie Company men who were at Son My, so I started making calls and writing letters. I eventually reached five former soldiers willing to speak on the record. Dennis Bunning, a former private first class in Second Platoon who now lives in California, remembered Medina’s pep talk this way: “We’re going to get even with them for all the losses we’ve had. We’re going in there, we’re killing everything that’s alive. We’re throwing the bodies down the wells, we’re burning the villages, and we’re wiping them off of the map.” It would have been a compelling message for young men who had spent the previous months getting attacked by invisible forces. They had lost friends to booby traps, land mines and sniper fire. By March 16, Charlie Company alone had suffered 28 casualties, five dead and many others permanently maimed, without once engaging directly with an enemy combatant. “Most of everything that was going on was insanity in my view. It was trying to survive,” said Lawrence La Croix of Utah, who was only 18 when he went into Son My as a Second Platoon squad leader. 
“The problem is, when you step on a mine or a booby trap there’s nothing to take your anger out on. It’s not like a firefight where you get to shoot back. You can’t shoot a mine. It doesn’t really care.” “All your friends are getting killed and there is nobody to fight,” echoed John Smail, Third Platoon squad leader, now living in Washington State. “So when we thought we had a chance to meet them head-on, we were pumped.” Kenneth Hodges, a former sergeant, who is now living in rural Georgia, told me he was devastated when he heard of Calley’s partial apology at the Columbus Kiwanis Club. “I felt like crying, really, because he had nothing to apologize for,” said Hodges. “I know today I don’t have anything to apologize for. I went to Vietnam and I served two tours and I served honorably. On that particular operation, I carried out the order as it was issued. A good soldier receives, obeys and carries out the orders that he is issued, and he reports back. That’s the way it was in ’68. That’s the way I was trained.” In contrast, Meadlo expressed intense remorse. He is living in Indiana, and he says that as he gets older the memories of My Lai come back more frequently, not less. “When I’m sleeping, I can actually see the faces, and that’s the honest-to-God truth,” he told me. “I can actually see the faces and the terror and all those people’s eyes. And I wake up and I’m just shaking and I just can’t hardly cope with it. The nightmares and everything will never go away. I’m sure of that. But I have to live with it.” Meadlo stood 10 to 15 feet away from a group of villagers and went through at least four clips of 17 bullets each. He almost certainly killed relatives of the people I spoke with in Vietnam. It might have been Meadlo’s bullets that struck Truong Thi Le’s daughter or his Zippo that burned Tran Nam’s home. The day after the massacre, Meadlo stepped on a land mine and his right foot was blown off. As he was whisked away on a helicopter, Meadlo reportedly shouted, “Why did you do it? This is God’s punishment to me, Calley, but you’ll get yours! God will punish you, Calley!” Meadlo is still angry at the U.S. government for sending him to Vietnam in the first place, but he says he no longer holds a grudge against Calley. “I think he believed that he was doing his duty and doing his job when he was over there,” he told me. “He might have got sidetracked.” ********** Tran Nam, the Son My villager who hid under a bed as a 6-year-old while his family fell around him, is now 56 years old. He works as a gardener at the Son My Vestige Site, a small museum dedicated to the memory of all those killed in 1968. The garden contains the brick bases of 18 out of the 247 homes that were otherwise destroyed that day. In front of each is a plaque with the name of the family that lived there and a list of the members of that family who were killed. Inside the museum, items that once belonged to the people of Son My sit in glass cases: the rosary beads and Buddhist prayer book of the 65-year-old monk Do Ngo, the round-bellied fish sauce pot of 40-year-old Nguyen Thi Chac, the iron sickle of 29-year-old Phung Thi Muong, a single slipper of 6-year-old Truong Thi Khai and the stone marbles of two young brothers. One case displays a hairpin that belonged to 15-year-old Nguyen Thi Huynh; her boyfriend held onto it for eight years after the massacre before donating it to the museum. 
At the museum’s entrance is a large black marble plaque that bears the names and ages of every person killed in Son My on March 16, 1968. The list includes 17 pregnant women and 210 children under the age of 13. Turn left and there is a diorama of how the village looked before every dwelling was burned down. The walls are lined with Ronald Haeberle’s graphic photos, as well as pictures of Calley and other soldiers known to have committed atrocities, including Meadlo and Hodges. American heroes are celebrated, like Ronald Ridenhour, the ex-G.I. who first exposed the killings (he died in 1998), and Hugh Thompson, a pilot, and Lawrence Colburn, a gunner, who saved nine or ten civilians the day of the massacre by airlifting them on their helicopter (both Thompson and Colburn later died of cancer). There are also photos of former U.S. soldiers who have visited the museum, including a Vietnam veteran named Billy Kelly who has 504 roses delivered to the museum on the anniversary of the massacre every year. Sometimes he brings them personally. The director of the museum, Pham Thanh Cong, is a survivor himself. He was 11 years old when he and his family heard the Americans shooting and hid in a tunnel underneath their home. As the soldiers approached, Cong’s mother told him and his four siblings to move deeper inside. A member of the U.S. Army then threw a grenade into the tunnel, killing everyone except Cong, who was injured by the shrapnel and still bears a scar next to his left eye. When we sat down, Cong thanked me for coming to the museum, for “sharing the pain of our people.” He told me it had been a complete surprise when the troops entered the village. “No one fought back,” he said. “After four hours, they killed the entire village and withdrew, leaving our village full of blood and fire.” Cong’s full-time job is to make sure the massacre is not forgotten. For Americans, My Lai was supposed to be a never-again moment. In 1969, the antiwar movement turned one of Haeberle’s photographs of dead women and children into a poster, overlaid with a short, chilling quote from Meadlo: “And babies.” It was largely because of My Lai that returning Vietnam veterans were widely derided as “baby killers.” Even decades later, military personnel used the massacre as a cautionary tale, a reminder of what can happen when young soldiers unleash their rage on civilians. “No My Lais in this division—do you hear me?” Maj. Gen. Ronald Griffith told his brigade commanders before entering battle in the Persian Gulf War. Yet Cong and the other survivors are painfully aware that all the soldiers involved in the massacre went free. The only one to be convicted was released after a brief and comfortable captivity. I asked Cong whether he would welcome a visit from Calley. “For Vietnamese people, when a person knows his sin, he or she must repent, pray and acknowledge it in front of the spirits,” Cong told me. “Then he will be forgiven and his mind will be relaxed.” Indeed, the home of every survivor I interviewed had an altar in the living room, where incense was burned and offerings were made to help the living venerate dead family members. It seems unlikely that Calley will make that trip. (Smithsonian offered him the opportunity to accompany me to Vietnam and he declined.) “If Mr. Calley does not return to Vietnam to repent and apologize to the 504 spirits who were killed,” Cong told me, “he will always be haunted, constantly obsessed until he dies, and even when he dies, he won’t be at peace. 
So I hope he will come to Vietnam. These 504 spirits will forgive his sins, his ignorant mind that caused their death.” This article is a selection from the January/February issue of Smithsonian magazine
9da6d16fcd7e7ae7f97c4372405e3871
https://www.smithsonianmag.com/history/going-nuclear-over-the-pacific-24428997/
Going Nuclear Over the Pacific
Going Nuclear Over the Pacific The summer of 2012 will be remembered as a time when people around the world were caught up in events in the skies above Mars, where the rover Curiosity eventually touched down on the red planet. Fifty years ago this summer, there were strange doings in the skies above Earth as well. In July 1962, eight airplanes, including five commercial flights, plummeted to the ground in separate crashes that killed hundreds. In a ninth incident that month, a vulture smashed through the cockpit window of an Indian Airlines cargo plane, killing the co-pilot. Higher in the atmosphere, cameras mounted in U-2 spy planes soaring above the Caribbean captured images of Soviet ships that, unbeknownst to the U.S. at the time, were carrying missiles to Cuba. In gray skies over Cape Cod, a 20-year-old telephone operator named Lois Ann Frotten decided to join her new fiancé in a celebratory jump from an airplane at 2,500 feet. It was her first attempt at skydiving. While her fiancé landed safely, Frotten’s chute got tangled and failed to open fully. She tumbled end over end and landed feet-first in Mystic Lake with a terrific splash—and survived the half-mile free fall with a cut nose and two small cracked vertebrae. “I’ll never jump again,” she told rescuers as she was pulled from the lake. But of all the things happening in the skies that summer, nothing would be quite as spectacular, surreal and frightening as the military project code-named Starfish Prime. Just five days after Americans across the country witnessed traditional Fourth of July fireworks displays, the Atomic Energy Commission created the greatest man-made light show in history when it launched a thermonuclear warhead on the nose of a Thor rocket, creating a suborbital nuclear detonation 250 miles above the Pacific Ocean. In the fifty minutes that followed, witnesses from Hawaii to New Zealand were treated to a carnival of color as the sky was illuminated in magnificent rainbow stripes and an artificial aurora borealis. With a yield of 1.45 megatons, the hydrogen bomb was approximately 100 times more powerful than the atomic bomb dropped on Hiroshima 17 years before. Yet scientists underestimated the effects of the bomb and the resulting radiation. Knowledge of radiation in space was still fragmentary and new. It was only four years earlier that James A. Van Allen, a University of Iowa physicist who had been experimenting with Geiger counters on satellites, claimed to have discovered that the planet was encircled by a “deadly band of X-rays,” and that radiation from the sun “hit the satellites so rapidly and furiously” that the devices jammed. Van Allen announced his findings on May 1, 1958, at a joint meeting of the National Academy of Sciences and the American Physical Society, and the following day, the Chicago Tribune bannered the headline, “Radiation Belt Dims Hope of Space Travel.” The story continued: “Death, lurking in a belt of unexpectedly heavy radiation about 700 miles above the earth, today dimmed man’s dream of conquering outer space.” News of the “hot band of peril” immediately cast doubt on whether Laika, the Russian dog, would have been able to survive for a week in space aboard Sputnik II, as the Soviets claimed, in November of 1957. (The Soviets said that after six days, the dog’s oxygen ran out and she was euthanized with poisoned food. 
It was later learned that Laika, the first live animal to be launched into space, died just hours after the launch from overheating and stress, when a malfunction in the capsule caused the temperature to rise.) What Van Allen had discovered were the bands of high-energy particles held in place by strong magnetic fields, soon known as the Van Allen Belts. A year later, he appeared on the cover of Time magazine as he opened an entirely new field of research—magnetospheric physics—and catapulted the United States into the race to space with the Soviet Union. On the same day Van Allen held his press conference in May 1958, he agreed to cooperate with the U.S. military on a top-secret project. The plan: to send atomic bombs into space in an attempt to blow up the Van Allen Belts, or to at least disrupt them with a massive blast of nuclear energy. At the height of the Cold War, the thinking may have been, as the science historian James Fleming said recently, that “if we don’t do it, the Russians will.” In fact, over the next few years, both the United States and the Soviet Union tested atomic bombs in space, with little or no disruption in the Van Allen Belts. Fleming suspects that the U.S. military may have theorized that the Van Allen Belts could be used to attack the enemy. But in July 1962, the United States was ready to test a far more powerful nuclear bomb in space. The first Starfish Prime launch, on June 20, 1962, at Johnston Island in the Pacific, had to be aborted when the Thor launch vehicle failed and the missile began to break apart. The nuclear warhead was destroyed mid-flight, and radioactive contamination rained back down on the island. Despite protests from Tokyo to London to Moscow citing “the world’s violent opposition” to the July 9 test, the Honolulu Advertiser carried no ominous portent with its headline, “N-Blast Tonight May Be Dazzling; Good View Likely,” and hotels in Hawaii held rooftop parties. The mood on the other side of the planet was somewhat darker. In London, England, 300 British citizens demonstrated outside the United States Embassy, chanting “No More Tests!” and scuffling with police. Canon L. John Collins of St. Paul’s Cathedral called the test “an evil thing,” and said those responsible were “stupid fools.” Izvestia, the Soviet newspaper, carried the headline, “Crime of American Atom-mongers: United States Carries Out Nuclear Explosion in Space.” Soviet film director Sergei Yutkevich told the paper, “We know with whom we are dealing: yet we hoped, until the last moment, that the conscience, if not the wisdom, of the American atom-mongers would hear the angry voices of millions and millions of ordinary people of the earth, the voices of mothers and scientists of their own country.” (Just eight months before, the Soviets had tested the Tsar Bomba, the most powerful nuclear weapon ever detonated—a 50-megaton hydrogen bomb—on an archipelago in the Arctic Ocean in the north of Russia.) Just after 11 p.m. Honolulu time on July 9, the 1.45-megaton hydrogen bomb was detonated thirteen minutes after launch. Almost immediately, an electromagnetic pulse knocked out electrical service in Hawaii, nearly 1,000 miles away. Telephone service was disrupted, streetlights went out and burglar alarms were set off by a pulse that was much larger than scientists expected. Suddenly, the sky above the Pacific was illuminated by bright auroral phenomena. 
“For three minutes after the blast,” one reporter in Honolulu wrote, “the moon was centered in a sky partly blood-red and partly pink. Clouds appeared as dark silhouettes against the lighted sky.” Another witness said, “A brilliant white flash burned through the clouds rapidly changing to an expanding green ball of irradiance extending into the clear sky above the overcast.” Others as far away as the Fiji Islands—2,000 miles from Johnston Island—described the light show as “breathtaking.” On Maui, a woman observed auroral lights that lasted a half hour in “a steady display, not pulsating or flickering, taking the shape of a gigantic V and shading from yellow at the start to dull red, then to icy blue and finally to white.” “To our great surprise and dismay, it developed that Starfish added significantly to the electrons in the Van Allen belts,” Atomic Energy Commission Chairman Glenn Seaborg wrote in his memoirs. “This result contravened all our predictions.” More than half a dozen satellites had been victimized by radiation from the blast. Telstar, the AT&T communications satellite launched one day after Starfish, relayed telephone calls, faxes and television signals until its transistors were damaged by Starfish radiation. (The Soviets tested their own high-altitude thermonuclear device in October 1962, which further damaged Telstar’s transistors and rendered it useless.) Both the Soviets and the United States conducted their last high-altitude nuclear explosions on November 1, 1962. It was also the same day the Soviets began dismantling their missiles in Cuba. Realizing that the two nations had come close to a nuclear war, and prompted by the results of Starfish Prime and continuing atomic tests by the Soviets, President John F. Kennedy and Premier Nikita Khrushchev signed the Limited Nuclear Test Ban Treaty on July 25, 1963, banning atmospheric and exoatmospheric nuclear testing. And while the U.S. and the Soviet Union would continue their race to space at full throttle, for the time being, the treaty significantly slowed the arms race between the two superpowers. Sources Books: James Clay Moltz, The Politics of Space Security: Strategic Restraint and the Pursuit of National Interests, Stanford University Press, 2008. Rosemary B. Mariner and G. Kurt Piehler, The Atomic Bomb and American Society: New Perspectives, The University of Tennessee Press, 2009. Articles: “H-Blast Seen 4000 Miles, Triggers Russian Outcry,” Boston Globe, July 10, 1962. “Britons Protest Outside Embassy,” New York Times, July 10, 1962. “Pacific Sky Glows After Space Blast,” Hartford Courant, July 10, 1962. “Blackouts Last Only About Hour,” New York Times, July 10, 1962. “How Not to Test in Space” by Michael Krepon, The Stimson Center, November 7, 2011, http://www.stimson.org/summaries/how-not-to-test-in-space-/ “A Very Scary Light Show: Exploding H-Bombs in Space” Krulwich Wonders, NPR, All Things Considered, July 1, 2010, http://www.npr.org/templates/story/story.php?storyId=128170775 “9 July 1962 ‘Starfish Prime’, Outer Space” The Comprehensive Nuclear-Test-Ban-Treaty-Organization Preparatory Commission, http://www.ctbto.org/specials/infamous-anniversaries/9-july-1962starfish-prime-outer-space/ “Nuclear Test Ban Treaty” John F. Kennedy Presidential Library and Museum, http://www.jfklibrary.org/JFK/JFK-in-History/Nuclear-Test-Ban-Treaty.aspx Gilbert King is a contributing writer in history for Smithsonian.com. 
His book Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America won the Pulitzer Prize in 2013.
486efe29aa4d96bc571671c7a816998b
https://www.smithsonianmag.com/history/great-fire-london-was-blamed-religious-terrorism-180960332/
The Great Fire of London Was Blamed on Religious Terrorism
The Great Fire of London Was Blamed on Religious Terrorism The rumors spread faster than the blaze that engulfed London over five days in September 1666: that the fire raging through the city’s dense heart was no accident – it was deliberate arson, an act of terror, the start of a battle. England was at war with both the Dutch and the French, after all. The fire was a “softening” of the city ahead of an invasion, or they were already here, whoever “they” were. Or maybe it was the Catholics, who’d long plotted the downfall of the Protestant nation. Londoners responded in kind. Before the flames were out, a Dutch baker was dragged from his bakery while an angry mob tore it apart. A Swedish diplomat was nearly hung, saved only by the Duke of York who happened to see him and demand he be let down. A blacksmith “felled” a Frenchman in the street with a vicious blow with an iron bar; a witness recalled seeing his “innocent blood flowing in a plentiful stream down his ankles”. A French woman’s breasts were cut off by Londoners who thought the chicks she carried in her apron were incendiaries. Another Frenchman was nearly dismembered by a mob that thought that he was carrying a chest of bombs; the bombs were tennis balls. “The need to blame somebody was very, very strong,” attests Adrian Tinniswood, author of By Permission of Heaven: The Story of the Great Fire. The Londoners felt that “It can’t have been an accident, it can’t be God visiting this upon us, especially after the plague, this has to be an act of war.” As far as we know, it wasn’t. The fire started in the early hours of the morning of September 2 on Pudding Lane in the bakery of Thomas Farriner. Pudding Lane was (and still is) located in the centre of the City of London, the medieval city of around one square mile ringed by ancient Roman walls and gates and rivers now covered and forgotten. Greater London built up around these walls in the years after the Romans left in the 4th century, sprawling out in all directions, but the City of London remained (and still remains) its own entity, with its own elected Mayor and home to around 80,000 people in 1666. That number would have been higher, but the Black Plague had killed roughly 15 percent of the entire city’s population the previous year. Farriner was a maker of hard tack, the dry but durable biscuits that fed the King’s Navy; he’d closed for business on Saturday, September 1, at around 8 or 9 that night, extinguishing the fire in his oven. His daughter, Hanna, then 23, checked the kitchen at around midnight, making sure the oven was cold, then headed to bed. An hour later, the ground floor of the building was filled with smoke. The Farriners’ manservant, Teagh, raised the alarm, climbing to the upper floors where Thomas, Hanna, and their maid slept. Thomas, Hanna, and Teagh squeezed out of a window and scrambled along the gutter to a neighbor’s window. The maid, whose name remains unknown, did not and was the first to die in the fire. At first, few were overly concerned about the fire. London was a cramped, overcrowded city lighted by candles and fireplaces. Buildings were largely made of wood; fires were common. The last major fire was in 1633, destroying 42 buildings at the northern end of London Bridge and 80 on Thames Street, but there were smaller fires all the time. The City of London’s Lord Mayor at the time, Sir Thomas Bloodworth, will ever be remembered as the man who declared that the 1666 fire was so small, “a woman might piss it out”. 
But Bloodworth, described by diarist Samuel Pepys as a “silly man”, wasn’t the only one to underestimate the fire: Pepys himself was woken at 3 that morning by his maid, but when he saw that the fire still seemed to be on the next street over, went back to sleep until 7. The London Gazette, the city’s twice-weekly newspaper, ran a small item about the fire in its Monday edition, among gossip about the Prince of Saxe’s unconsummated marriage to the Princess of Denmark and news of a storm in the English Channel. A second report on the fire that week, however, was not forthcoming. Within hours of printing Monday’s paper, the Gazette’s press burned to the ground. By the time the newspaper had hit the streets, Londoners were very much aware that the fire that the Gazette reported “continues still with great violence” had yet to abate. Several factors contributed to the fire’s slow but unstoppable spread: Many of the residents of Pudding Lane were asleep when the fire began and slow to react, not that they could have done much beyond throwing buckets of whatever liquid – beer, milk, urine, water – was on hand. A hot summer had left London parched, its timber and plaster buildings like well-dried kindling. These buildings were so close together that people on opposite sides of the narrow, filthy streets could reach out their windows and shake hands. And because London was the manufacturing and trade engine of England, these buildings were also packed with flammable goods – rope, pitch, flour, brandy and wool. But by Monday evening, Londoners began to suspect that this fire was no accident. The fire itself was behaving suspiciously; it would be subdued, only to break out somewhere else, as far as 200 yards away. This led people to believe that the fire was being intentionally set, although the real cause was an unusually strong wind that was picking up embers and depositing them all over the city. “This wind blowing from the east was forcing the fire across the city much quicker than people were expecting,” explains Meriel Jeater, curator of the Museum of London’s “Fire! Fire! Exhibition,” commemorating the 350th anniversary of the fire. Sparks would fly up and set fire to whatever they landed on. “It seemed that suddenly, another building was on fire and it was, ‘Why did that happen?’ They didn’t necessarily think there was a spark involved, or another natural cause… England was at war, so it was perhaps natural to assume that there might have been some element of foreign attack to it.” Embers and wind didn’t feel like a satisfying or likely answer, so Londoners started to feel around for someone to blame. And they found them. At the time, London was the third largest city in the Western world, behind Constantinople and Paris, and roughly 30 times larger than any other English town. And it was international, with trade links all over the world, including countries that it was at war with, Holland and France, and those it wasn’t entirely comfortable with, including Spain. London was also a refuge for foreign Protestants fleeing persecution in their majority Catholic homelands, including the Flemish and French Huguenots. That people believed that the city was under attack, that the fire was the plot of either the Dutch or the French, was logical, not paranoia. The English had burnt the Dutch port city of West-Terschelling to the ground just two weeks earlier. 
As soon as the fire broke out, Dutch and French immigrants were immediately under suspicion; as the fire burned, the English authorities stopped and interrogated foreigners at ports. More troubling, however, was that Londoners began to take vengeance into their own hands, says Tinniswood. “You’re not looking at a population that can distinguish between a Dutchman, a Frenchman, a Spaniard, a Swede. If you’re not English, good enough.” “The rumors reach a kind of crescendo on the Wednesday night when the fire is subsiding and then breaks out just around Fleet Street,” says Tinniswood. Homeless Londoners fleeing the fire were camped in the fields around the City. A rumor went up that the French were invading the city, then the cry: “Arms, arms, arms!” “They’re traumatized, they’re bruised, and they all, hundreds and thousands of them, they take up sticks and come pouring into the city,” says Tinniswood. “It’s very real… A lot of what the authorities are doing is trying to damp down that sort of panic.” But extinguishing the rumors proved almost as difficult as putting out the fire itself. Rumors traveled fast, for one thing: “The streets are full of people, moving their goods... They’re having to evacuate two, three, four times,” Tinniswood explains, and with each move, they’re out in the street, passing information. Compounding the problem was that there were few official ways to contradict the rumors – not only had the newspaper’s printing press burned down, but so too had the post office. Charles II and his courtiers maintained that the fire was an accident, and though they were themselves involved in fighting the fire on the streets, there was only so much they could do to also stop the misinformation spreading. Says Tinniswood, “There’s no TV, no radio, no press, things are spread by word of mouth, and that means there must have been a thousand different rumors. But that’s the point of it: nobody knew.” Several people judged to be foreigners were hurt during Wednesday’s riot; contemporaries were surprised that no one had been killed. The next day, Charles II issued an order, posted in places around the city not on fire, that people should “attend the business of quenching the fire” and nothing else, noting that there were enough soldiers to protect the city should the French actually attack, and explicitly stating that the fire was an act of God, not a “Papist plot”. Whether or not anyone believed him was another issue: Charles II had only been restored to his throne in 1660, 11 years after his father, Charles I, was beheaded by Oliver Cromwell’s Parliamentarian forces. The City of London had sided with the Parliamentarians; six years later, Londoners still didn’t entirely trust their monarch. The fire finally stopped on the morning of September 6. Official records put the number of deaths as fewer than 10, although Tinniswood and Jeater both believe that number was higher, probably more like 50. It’s still a surprisingly small number, given the huge amount of property damage: 80 percent of the city within the walls had burned, some 87 churches and 13,200 homes were destroyed, leaving 70,000 to 80,000 people homeless. The total financial loss was in the region of £9.9 million, at a time when the annual income of the city was put at only £12,000. On September 25, 1666, the government set up a committee to investigate the fire, hearing testimony from dozens of people about what they saw and heard. Many were compelled to come forward with “suspicious” stories. 
The report was given to Parliament on January 22, 1667, but excerpts from the proceedings transcripts were leaked to the public, published in a pamphlet. By this time, just a few months after the fire, the narrative had changed. Demonstrably, the Dutch and the French hadn’t invaded, so blaming a foreign power was no longer plausible. But the people still wanted someone to blame, so they settled on the Catholics. “After the fire, there seems to be a lot of paranoia that it was a Catholic plot, that Catholics in London would conspire with Catholics abroad and force the Protestant population to convert to Catholicism,” Jeater explains. The struggle between Catholicism and Protestantism in England had been long and bloody, and neither side was above what amounted to terrorism: The Gunpowder Plot of 1605 was, after all, an English Catholic plot to assassinate James I. The official report issued to Parliament rejected much of the testimony as unbelievable – one committee member called the allegations “very frivolous”, and the conclusion declared there was no evidence “to prove it to be a general design of wicked agents, Papists or Frenchmen, to burn the city”. It didn’t matter: The leaked excerpts did much to solidify the story that the fire was the work of shadowy Catholic agents. For example: William Tisdale informs, That he being about the beginning of July at the Greyhound in St. Martins, with one Fitz Harris an Irish Papist, heard him say, ‘There would be a sad Desolation in September, in November a worse, in December all would be united into one.’ Whereupon he asked him, ‘where this Desolation would be?’ He answered, ‘In London.’ Or: Mr. Light of Ratcliff, having some discourse with Mr. Longhorn of the Middle-Temple, Barrister, [reputed a zealous Papist] about February 15 last, after some discourse in disputation about Religion, he took him by the hand, and said to him, ‘You expect great things in Sixty Six, and think that Rome will be destroyed, but what if it be London?’ “You’ve got hundreds of tales like that: With hindsight, people are saying that guy said something like, ‘London better look out’,” said Tinniswood. “It’s that kind of level, it’s that vague.” What’s even more confusing is that by the time the testimonies were leaked, someone had already confessed to and been hung for the crime of starting the fire. Robert Hubert, a 26-year-old watchmaker’s son from Rouen, France, had been stopped at Romford, in Essex, trying to make it to the east coast ports. He was brought in for questioning and, bizarrely, told authorities that he’d set the fire, that he was part of a gang, that it was all a French plot. He was indicted on felony charges, transported back to London under heavy guard and installed at the White Lion Gaol in Southwark, the City’s gaols having burned down. In October 1666, he was brought to trial at the Old Bailey. There, Hubert’s story twisted and turned – the number of people in his gang went from 24 to just four; he’d said he’d started it in Westminster, then later, after spending some time in jail, said the bakery at Pudding Lane; other evidence suggested that he hadn’t even been in London when the fire started; Hubert claimed to be a Catholic, but everyone who knew him said he was a Protestant and a Huguenot. The presiding Lord Chief Justice declared Hubert’s confession so “disjointed” he couldn’t possibly believe him guilty. And yet, Hubert insisted that he’d set the fire. 
On that evidence, the strength of his own conviction that he had done it, Hubert was found guilty and sentenced to death. He was hung at Tyburn on October 29, 1666. Why Hubert said he did it remains unclear, although there is a significant body of literature on why people confess to things they couldn’t possibly have done. Officials were in the strange position of trying to prove he hadn’t done what he said he did, but Hubert was adamant – and everyone else simply thought he was, to put it in contemporary terms, mad. The Earl of Clarendon, in his memoirs, described Hubert as a “poor distracted wretch, weary of his life, and chose to part with it this way” – in other words, suicide by confession. Having someone to blame was certainly better than the alternative being preached from the city’s remaining pulpits: That the fire was God’s vengeance on a sinful city. They’d even named a particular sin – because the fire started at a bakery on Pudding Lane and ended at Pie Corner, opportunistic preachers took the line that Londoners were gluttonous reprobates who needed to repent now. Pie Corner is still marked with a statue of a plump golden boy, formerly known as the Fat Boy, which was intended as a reminder of London’s sinning ways. The Catholic conspiracy story persisted for years: In 1681, the local ward erected a plaque on the site of the Pudding Lane bakery reading, “Here by the permission of Heaven, Hell broke loose upon this Protestant city from the malicious hearts of barbarous Papists, by the hand of their agent Hubert, who confessed…”. The plaque remained in place until the middle of the 18th century, when it was removed not because people had had a change of heart, but because visitors stopping to read the plaque were causing a traffic hazard. The plaque, which appears to have cracked in half, is on display at the Fire! Fire! exhibition. Also in 1681, a final line was added to the north-face inscription on the public monument to the fire: “But Popish frenzy, which wrought such horrors, is not yet quenched.” The words weren’t removed until 1830, with the Catholic Emancipation Act that lifted restrictions on practicing Catholics. “Whenever there is a new bout of anti-Catholic sentiment, everybody harks back to the fire,” says Tinniswood. And 1681 was a big year for anti-Catholic rhetoric, prompted in part by the dragonnades in France that forced French Protestants to convert to Catholicism and, closer to home, by the so-called “Popish Plot,” a fictitious Catholic conspiracy to assassinate Charles II entirely invented by a former Church of England curate whose false claims resulted in the executions of as many as 35 innocent people. In the immediate aftermath of the fire of 1666, London was a smoking ruin, smoldering with suspicion and religious hatred and xenophobia. And yet within three years, the city had rebuilt. Bigotry and xenophobia subsided – immigrants remained and rebuilt, more immigrants joined them later. But that need to blame, often the person last through the door or the person whose faith is different, never really goes away. “The outsider is to blame, they are to blame, they are attacking us, we’ve got to stop them – that kind of rhetoric is sadly very obvious… and everywhere at the moment, and it’s the same thing, just as ill-founded,” Tinniswood said, continuing, “There is still a sense that we need to blame. We need to blame them, whoever they are.” Linda Rodriguez McRobbie is an American freelance writer living in London, England. 
She covers the weird stuff for Smithsonian.com, Boing Boing, Slate, mental_floss, and others, and she's the author of Princesses Behaving Badly.
b4d663af3996f457a5c9e7f23531b32f
https://www.smithsonianmag.com/history/great-los-angeles-air-raid-terrified-citizenseven-though-no-bombs-were-dropped-180967890/
The Great Los Angeles Air Raid Terrified Citizens—Even Though No Bombs Were Dropped
The Great Los Angeles Air Raid Terrified Citizens—Even Though No Bombs Were Dropped This past Saturday, residents of Hawaii were alarmed as cell phones across the island state chimed with an early morning emergency alert. “Ballistic missile threat inbound to Hawaii. Seek immediate shelter. This is not a drill,” the message read. With North Korea launching numerous missiles throughout 2017, and previously threatening to attack the U.S. territory of Guam, Hawaiian citizens—and countless tourists—were quick to assume the worst. For 38 minutes, chaos and panic reigned as people abandoned their cars on the highway to seek shelter before finally receiving word that the alert had been sent by accident. As terrifying as the experience was for those on the archipelago, it’s not the first time an impending attack has turned out to be a false alarm. Take the Battle of Los Angeles, for instance. Never heard of it? That’s because nothing actually happened. Often relegated to a footnote in the history of World War II, the “battle” is a prime example of what can happen when the military and civilians expect an invasion at any moment. The first months of 1942 were strained ones for the West Coast. After the unanticipated attack on Pearl Harbor on December 7, 1941, resulted in the deaths of 2,403 Americans, President Franklin Delano Roosevelt asked Congress to declare war and join the Allied Powers. At that point, Los Angeles already ranked first of all cities in America in production of aircraft, and the city’s San Pedro Bay housed an enormous naval armada. By October 1941, the shipbuilding industry in the city had jumped to 22,000 employees, up from 1,000 only two years earlier. With its vulnerable location on the Pacific Ocean, and noticeably growing manufacturing centers, Angelenos feared their city might be the next target for Japanese fleets. “We imagined parachutes dropping. We imagined the hills of Hollywood on fire. We imagined hand-to-hand combat on Rodeo Drive,” actor and writer Buck Henry said of the tense atmosphere. Those fears weren’t entirely unfounded. While the Japanese weren’t planning on launching an attack by air—doing so would require bringing their aircraft carriers within range of the U.S. military, risking their loss—they did send submarines. On December 23, 1941, those submarines sank the oil tanker Montebello off California’s coast, and then attacked the lumber ship SS Absaroka the very next day, causing minor damage and killing one crew member. But their real coup came on February 23, when the cruiser submarine I-17, captained by Kozo Nishino, entered the Santa Barbara Channel and began firing on the Ellwood Oil Field, just 10 miles north of Santa Barbara. “It was a real pinprick attack with highly inaccurate gunfire. They only fired between 16 and 24 shells and actually missed a very huge petrol container that would’ve caused major damage,” says historian Mark Felton, author of The Fujita Plan: Japanese Attacks on the United States & Australia During the Second World War, slated to be re-released by Thistle Publishing. Even though the Ellwood attack caused little damage and no loss of life, it succeeded in taking a psychological toll—exactly what the Japanese intended, Felton says. “[The attack] created mass panic along the coast because for the first time the Japanese had actually physically hit the continental U.S., and that happened in the middle of the night. At this point the U.S. has no ability to send aircraft up to deal with that, because they had no radar. 
It gave the feeling to the American West Coast that they were highly vulnerable.” Those jitters carried into the following days, and around 1:45 a.m. on February 25, the newly developed coastal radar picked up a blip: an unidentified aerial target 120 miles west of Los Angeles and heading straight for the city. By 2:15 two more radar sites confirmed the object, and at 2:25 the city’s air raid warning system went off. Then the shooting began. “Residents from Santa Monica southward to Long Beach, covering a thirty-nine mile arc, watched from rooftops, hills and beaches as tracer bullets, with golden-yellowish tints, and shells like skyrockets offered the first real show of the Second World War on the United States mainland,” the New York Times reported the next day. “I remember my mom being so nervous her teeth were chattering. It was really scary,” said Anne Ruhge to Liesl Bradner of Military History. “We thought it was another invasion.” By 7:21 a.m. the regional warning center finally issued an all clear, and the cleanup began. The incident had indirectly resulted in five casualties, due to car accidents that happened during the blackout and heart attacks caused by shock. Anti-aircraft batteries had fired off more than 1,400 rounds, none of which hit any enemy aircraft, because there hadn’t been any enemy aircraft to begin with. The likeliest explanation for what had appeared on the radar was a stray weather balloon drifting toward land. But in the immediate aftermath, the U.S. Navy and U.S. Army disagreed about what had actually happened, writes John Geoghegan in Operation Storm: Japan’s Top Secret Submarines and Their Plan to Change the Course of World War II. While Secretary of War Henry L. Stimson claimed as many as 15 aircraft had flown over Los Angeles, Navy Secretary Frank Knox said, “As far as I know the whole raid was a false alarm… attributed to jittery nerves.” In the end, no trace of enemy aircraft or soldiers was ever discovered, and the military was forced to admit the “Battle” of Los Angeles was a false alarm. But it did galvanize the city and the military, says Arthur C. Verge, professor of history at El Camino College. “As bad as the Battle of Los Angeles was, I think it was a wake-up call. Some people saw [the war] as far away, in the Hawaiian Islands, but now it was real, right next door.” That meant people were more willing to support the military with small actions, like rationing food or selling war bonds. In fact, the false alarm air raid has continued to play a role in the city’s history, says Stephen Nelson, director and curator of the Fort MacArthur Museum in San Pedro. For the past 15 years, the museum has held an annual reenactment event to commemorate the Great Los Angeles Air Raid, resulting in Nelson spending years doing research for a book on the raid, which he hopes will be published sometime next year. “We started the event because it was something unique we could do to make money. Part of the battle actually occurred on the hillside [where the museum is located] so that’s an original part of our history,” Nelson says. In his research, Nelson spoke with 10 veterans of the war who participated in the air raid, and learned how important the incident was to them. “Almost every one of them said that’s where they got their first experience with battle conditions,” Nelson says. Even if the attack didn’t include any enemy fighters, it still felt as terrifying and important as if it had been real. 
But the repercussions went far beyond the experience of air wardens pulled into action that night. This “attack” came only days after President Roosevelt’s executive order 9066—the one that authorized the internment of Japanese-Americans. Roosevelt signed it in large part due to fears that Japanese-Americans were collaborating with the Japanese military. “Prior to the raid there was a great deal of suspicion,” Felton says. “The LAPD reported that Japanese citizens had been signaling Japanese aircraft, although there’s no evidence for that.” Lack of evidence, however, made no difference to military generals. By March 2 they had issued a public proclamation dividing California, Washington, Oregon and Arizona into two military zones, with one as a restricted zone from which all people of Japanese ancestry would soon be banned. By the end of the war, nearly 120,000 people—most of them American citizens—had been forcibly removed to internment camps across the country. The last of those camps wasn’t closed until March 1946. “The battle has pretty much been a footnote in history, for at least my lifetime,” Nelson says. “I think it deserves more than that.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
49f2910acf7b0f683fbd645dba5eb827
https://www.smithsonianmag.com/history/groundbreaking-exhibit-mount-vernon-slaves-speak-and-history-listens-180960747/
In a Groundbreaking Exhibit at Mount Vernon, Slaves Speak and History Listens
In a Groundbreaking Exhibit at Mount Vernon, Slaves Speak and History Listens You are dining with the President. Frank Lee, standing tall in his red-and-white livery, takes your note of introduction in Mount Vernon’s entry hall. The enslaved butler chooses a spot for you to wait–either in the elegant, robin’s egg blue front parlor, or in the cozier “little parlor”–while he alerts George Washington and wife Martha to your arrival. As the opal haze of a July afternoon rolls off the nearby Potomac River, Lee’s wife, Lucy, labors alongside another enslaved cook, Hercules, to ready dishes for the 3:30 p.m. dinner. Frank, with the aid of waiters Marcus and Christopher Sheels, serves your meal. Around 6 o’clock, they wheel out a silver hot-water urn, and you adjourn to the portico for coffee, tea and conversation with the first family. Above, in a guestroom, enslaved housemaids, like seamstresses Caroline Branham and Charlotte, go about the last tasks of a day begun at dawn. They carry up fresh linens and refill water jugs. Mount Vernon’s enslaved grooms make a last check on the horses. This was how English architect Benjamin Henry Latrobe likely experienced his July 16, 1796 visit to Washington’s estate. During his stay, he sketched the grounds and the people with customary fervor. In Latrobe’s first draft of a painting of his day with President Washington, the silhouette of an enslaved man (possibly Frank Lee) was part of the picture. But in the finished watercolor, he is gone. Lives Bound Together: Slavery at George Washington’s Mount Vernon, a new exhibit at the Virginia estate, on view through 2018, brings Frank, Hercules, Lucy, and other slaves at Mount Vernon to the fore. It’s a project that has been many years in the making. “Our goal was to humanize people,” says Susan P. Schoelwer, Mount Vernon’s Robert H. Smith Senior Curator. “We think of them as individual lives with human dignity.” The exhibition centers on 19 of the 317 enslaved individuals who worked and lived at Mount Vernon during the Washingtons’ lifetime. Mining a rare cache of material culture, artwork, farm tools and plantation records, curators partnered with scholars and descendants of the enslaved to retell their shared past through the stuff of everyday life. “I know that they are speaking again,” says descendant Judge Rohulamin Quander, a member of one of the oldest traceable African-American families in the United States. “Those voices were unsung up until 1799, and we don’t have any pictures or voice recordings of what they had to say. But they have reached out beyond the grave and said to each of us, we’re depending upon you. You have to do this for us.” In his 1799 will, Washington included a slave census and a directive to emancipate his slaves. His decision to do so–which Martha promptly carried out–reflects the nearly seven decades the President spent thinking about slavery’s effects on farming and families. Boldly, Lives Bound Together raises a thorny set of questions: What sort of slave owner was Washington? How and why did his thoughts on slavery change? Records show that George, a slave owner since age 11, brought fewer slaves to his 1759 marriage than Martha. Visitors to Mount Vernon left behind conflicting accounts of Washington’s treatment of his slaves. Whippings and hard labor were frequent forms of reprimand. Yet Washington depended on the enslaved population to take care of his family and secure plantation profits as he took on military and political duties. 
Often written far from home, some of Washington’s most fascinating correspondence was not with other “founders” but with his farm managers. On New Year’s Day 1789, for example, as the new federal government began to take real shape, Washington turned his attention to Mount Vernon’s needs. He wrote one overseer with clear instructions: “To request that my people may be at their work as soon as it is light—work ’till it is dark—and be diligent while they are at it can hardly be necessary, because the propriety of it must strike every manager who attends to my interest, or regards his own Character—and who on reflection, must be convinced that lost labour can never be regained—the presumption being, that, every labourer (male or female) does as much in the 24 hours as their strength, without endangering their health, or constitution, will allow of.” Despite his mounting responsibilities on the national stage, Washington remained a shrewd businessman. He relied on slaves to keep his Virginia plantation running at a profit, says David Hoth, senior editor at The Papers of George Washington editorial project. “He was inclined to suspect his workers of malingering and petty theft, perhaps because he recognized that they probably saw slavery as an unnatural and unpleasant condition,” says Hoth. “He sold at least one runaway to the West Indies and threatened others.” In private, the president came to support gradual abolition by legislative act and favored measures, like non-importation, that might hasten the change. He pursued Mount Vernon’s runaway slaves, albeit quietly, without using newspaper advertisements. By 1792-93, according to Hoth, George Washington began to mull the idea of emancipation. “It’s important to tell the story of his views on slavery and how they evolved,” says Schoelwer. “He was in the position of trying to balance private concerns with his public commitment to the survival of the nation.” At the same time, he used legal loopholes to make sure his slaves were kept enslaved. The Mount Vernon exhibit collects a diverse medley of African-American sagas that reconsider the 18th-century world’s understanding of slavery and freedom. Via short biographies, reinterpreted artifacts, and new archeological evidence from Mount Vernon’s slave cemetery, 19 lives emerge for new study. A new digital resource, an ever-evolving slavery database, allows visitors to search Mount Vernon’s enslaved community by name, skill or date range. So far, the database has gathered information on 577 unique individuals who lived or worked at Mount Vernon up to 1799, and compiled details on the more than 900 enslaved individuals with whom George Washington interacted during his travels, according to Jessie MacLeod, associate curator at Mount Vernon. But though it shows a thriving plantation, the database also tells a different story. “You really get a sense for how often people are running away,” says MacLeod. “There are casual mentions in the weekly reports of people being absent sometimes for 3 or 4 days. It’s not always clear whether they came back voluntarily or were captured. There’s no newspaper ad, but we do see an ongoing resistance in terms of absenteeism, and when they’re visiting family or friends in neighboring plantations.” In the museum world, reinterpretation of slavery and freedom has gained new momentum. Mount Vernon’s “Lives Bound Together” exhibit reflects historic sites’ turn to focus on the experience of the enslaved, while exploring the paradox of liberty and slavery in daily life. 
In recent years, historians at Mount Vernon, along with those at Thomas Jefferson’s Monticello and James Madison’s Montpelier, have rethought how to present those stories to the public through new signage, “slave life” walking tours, and open archeological digs. A series of scholarly conferences–sponsored by institutions like the Omohundro Institute of Early American History and Culture, the National Endowment for the Humanities, the University of Virginia, and many more–has been hosted at the former presidential homes. Latrobe’s portrait of life at Mount Vernon may have initially included the slaves who made Washington’s estate hum, but the finished painting only tells part of that story. Lives Bound Together completes the picture by depicting the shared journey of the Washingtons and the enslaved. “We helped build this place and make it what it is. We helped make the president who he was,” says Shawn Costley, a descendant of Davy and Edy Jones, in the exhibit’s film. “We might not have had voting power and all that back then, but we made that man, we made George Washington, or added to or contributed to him being the prominent person that he is today.” Sara Georgini is series editor for The Papers of John Adams, part of The Adams Papers editorial project at the Massachusetts Historical Society. She is the author of Household Gods: The Religious Lives of the Adams Family.
1861b8910ff45f1a3b47d98c3a6704cc
https://www.smithsonianmag.com/history/hamilton-takes-command-74722445/
Hamilton Takes Command
Hamilton Takes Command “ALEXANDER HAMILTON is the least appreciated of the founding fathers because he never became president,” says Willard Sterne Randall, a professor of humanities at Champlain College in Burlington, Vermont, and the author of Alexander Hamilton: A Life, released this month from HarperCollins Publishers. “Washington set the mold for the presidency, but the institution wouldn’t have survived without Hamilton.” Hamilton was born January 11, 1755, on the island of Nevis in the West Indies, the illegitimate son of James Hamilton, a merchant from Scotland, and Rachel Fawcett Levine, a doctor’s daughter who was divorced from a plantation owner. His unmarried parents separated when Hamilton was 9, and he went to live with his mother, who taught him French and Hebrew and how to keep the accounts in a small dry goods shop by which she supported herself and Hamilton’s older brother, James. She died of yellow fever when Alexander was 13. After her death, Hamilton worked as a clerk in the Christiansted (St. Croix) office of a New York-based import-export house. His employer was Nicholas Cruger, the 25-year-old scion of one of colonial America’s leading mercantile families, whose confidence he quickly gained. And in the Rev. Hugh Knox, the minister of Christiansted’s first Presbyterian church, Hamilton found another patron. Knox, along with the Cruger family, arranged a scholarship to send Hamilton to the United States for his education. At age 17, he arrived in Boston in October 1772 and was soon boarding at the Elizabethtown Academy in New Jersey, where he excelled in English composition, Greek and Latin, completing three years’ study in one. Rejected by Princeton because the college refused to go along with his demand for accelerated study, Hamilton went instead in 1773 to King’s College (now Columbia University), then located in Lower Manhattan. In events leading up to the excerpt that follows, Hamilton was swept up by revolutionary fervor and, at age 20, dropped out of King’s College and formed his own militia unit of about 25 young men. In June 1775, the Continental Congress in Philadelphia chose Virginia delegate Col. George Washington as commander in chief of the Continental Army then surrounding British-occupied Boston. Hurrying north, Washington spent a day in New York City, where, on Sunday, June 25, 1775, Alexander Hamilton braced at attention for Washington to inspect his militiamen at the foot of Wall Street. Two months later, the last hundred British troops withdrew from Manhattan, going aboard the 64-gun man-of-war Asia. At 11 o’clock on the night of August 23, Continental Army Artillery captain John Lamb gave orders for his company, supported by Hamilton’s volunteers and a light infantry unit, to seize two dozen cannons from the battery at the island’s southern tip. The Asia’s captain, having been warned by Loyalists that the Patriots would raid the fort that night, posted a patrol barge with redcoats just offshore. Shortly after midnight, the British spotted Hamilton, his friend Hercules Mulligan, and about 100 comrades tugging on ropes they had attached to the heavy guns. The redcoats opened a brisk musket fire from the barge. Hamilton and the militiamen returned fire, killing a redcoat. At this, the Asia hoisted sail and began working in close to shore, firing a 32-gun broadside of solid shot. One cannonball pierced the roof of Fraunces Tavern at Broad and Pearl Streets. 
Many years later Mulligan would recall: “I was engaged in hauling off one of the cannons, when Mister Hamilton came up and gave me his musket to hold and he took hold of the rope. . . . Hamilton [got] away with the cannon. I left his musket in the Battery and retreated. As he was returning, I met him and he asked for his piece. I told him where I had left it and he went for it, notwithstanding the firing continued, with as much concern as if the [Asia] had not been there.” Hamilton’s cool under fire inspired the men around him: they got away with 21 of the battery’s 24 guns, dragged them uptown to City Hall Park and drew them up around the Liberty Pole under guard for safekeeping. On January 6, 1776, the New York Provincial Congress ordered that an artillery company be raised to defend the colony; Hamilton, unfazed that virtually all commissions were going to native colonists of wealth and social position, leaped at the opportunity. Working behind the scenes to advance his candidacy, he won the support of Continental Congressmen John Jay and William Livingston. His mathematics teacher at King’s College vouched for his mastery of the necessary trigonometry, and Capt. Stephen Bedlam, a skilled artillerist, certified that he had “examined Alexander Hamilton and judges him qualified.” While Hamilton waited to hear about his commission, Elias Boudinot, a leader of the New Jersey Provincial Congress, wrote from Elizabethtown to offer him a post as brigade major and aide-de-camp to Lord Stirling (William Alexander), commander of the newly formed New Jersey Militia. It was tempting. Hamilton had met the wealthy Scotsman as a student at Elizabethtown Academy and thought highly of him. And if he accepted, Hamilton would likely be the youngest major in the Revolutionary armies. Then Nathanael Greene, a major general in the Continental Army, invited Hamilton to become his aide-de-camp as well. After thinking the offers over, Hamilton declined both of them, gambling instead on commanding his own troops in combat. Sure enough, on March 14, 1776, the New York Provincial Congress ordered Alexander Hamilton “appointed Captain of the Provincial Company of Artillery of this colony.” With the last of his St. Croix scholarship money, he had his friend Mulligan, who owned a tailor shop, make him a blue coat with buff cuffs and white buckskin breeches. He then set about recruiting the 30 men required for his company. “We engaged 25 men [the first afternoon],” Mulligan remembered, even though, as Hamilton complained in a letter to the provincial congress, he could not match the pay offered by Continental Army recruiters. On April 2, 1776, two weeks after Hamilton received his commission, the provincial congress ordered him and his fledgling company to relieve Brig. Gen. Alexander McDougall’s First New York Regiment, guarding the colony’s official records, which were being shipped by wagon from New York’s City Hall to the abandoned Greenwich Village estate of Loyalist William Bayard. In late May 1776, ten weeks after becoming an officer, Hamilton wrote to New York’s provincial congress to contrast his own meager payroll with the pay rates spelled out by the Continental Congress: “You will discover a considerable difference,” he said. “My own pay will remain the same as it is now, but I make this application on behalf of the company, as I am fully convinced such a disadvantageous distinction will have a very pernicious effect on the minds and behavior of the men. 
They do the same duty with the other companies and think themselves entitled to the same pay.” The day the provincial congress received Captain Hamilton’s missive, it capitulated to all his requests. Within three weeks, the young officer’s company was up to 69 men, more than double the required number. Meanwhile, in the city, two huge bivouacs crammed with tents, shacks, wagons and mounds of supplies were taking shape. At one of them, at the juncture of present-day Canal and Mulberry Streets, Hamilton and his company dug in. They had been assigned to construct a major portion of the earthworks that reached halfway across Manhattan Island. Atop Bayard’s Hill, on the highest ground overlooking the city, Hamilton built a heptagonal fort, Bunker Hill. His friend Nicholas Fish described it as “a fortification superior in strength to any my imagination could ever have conceived.” When Washington inspected the works, with its eight 9-pounders, four 3-pounders and six cohorn mortars, in mid-April, he commended Hamilton and his troops “for their masterly manner of executing the work.” Hamilton also ordered his men to rip apart fences and cut down some of the city’s famous stately elm trees to build barricades and provide firewood for cooking. In houses abandoned by Loyalists, his soldiers propped muddy boots on damask furniture, ripped up parquet floors to fuel fireplaces, tossed garbage out windows and grazed their horses in gardens and orchards. One Loyalist watched in horror as army woodcutters, ignoring his protests, chopped down his peach and apple orchards on 23rd Street. Despite a curfew, drunken soldiers caroused with prostitutes in the streets around Trinity Church. By midsummer, 10,000 American troops had transformed New York City into an armed camp. The very day—July 4, 1776—that the founding fathers of the young nation-to-be were signing the Declaration of Independence in Philadelphia, Captain Hamilton watched through his telescope atop Bayard’s Hill as a forest of ship masts grew ominously to the east; in all, some 480 British warships would sail into New York Harbor. One of Washington’s soldiers wrote in his diary that it seemed “all London was afloat.” Soon they had begun to disgorge the first of what would swell to 39,000 troops—the largest expeditionary force in English history—onto Staten Island. On July 9, at 6 o’clock in the evening, Hamilton and his men stood to attention on the commons to hear the declaration read aloud from the balcony of City Hall. Then the soldiers roared down Broadway to pull down and smash the only equestrian statue of King George III in America. Three days later, British Vice Admiral Lord Richard Howe detached two vessels from his flotilla, the 44-gun Phoenix and the 28-gun Rose, to sail up the Hudson and probe shore defenses. The captain of the Rose coolly sipped claret on his quarterdeck as his vessel glided past the battery on Lower Manhattan—where an ill-trained American gun crew immediately blew itself up. The ships sailed unmolested up the river to Tarrytown as colonial troops abandoned their posts to watch. An appalled Washington fumed: “Such unsoldierly conduct gives the enemy a mean opinion of the army.” On their return, the two British ships passed within cannon range of Hamilton’s company at Fort Bunker Hill. He ordered his 9-pounders to fire, which the British warships returned. In the brief skirmish, one of Hamilton’s cannons burst, killing one man and severely wounding another. 
On August 8, Hamilton tore open orders from Washington: his company was to be on round-the-clock alert against an imminent invasion of Manhattan. “The movements of the enemy and intelligence by deserters give the utmost reason to believe that the great struggle in which we are contending for everything dear to us and our posterity, is near at hand,” Washington wrote. But early on the morning of August 27, 1776, Hamilton watched, helpless, as the British ferried 22,000 troops from Staten Island, not to Manhattan at all, but to the village of Brooklyn, on Long Island. Marching quickly inland from a British beachhead that stretched from Flatbush to Gravesend, they met little resistance. Of the 10,000 American troops on Long Island, only 2,750 were in Brooklyn, in four makeshift forts spread over four miles. At Flatbush, on the American east flank, Lord Charles Cornwallis quickly captured a mounted patrol of five young militia officers, including Hamilton’s college roommate, Robert Troup, enabling 10,000 redcoats to march stealthily behind the Americans. Cut off by an 80-yard-wide swamp, 312 Americans died in the ensuing rout; another 1,100 were wounded or captured. By rowboat, barge, sloop, skiff and canoe in a howling northeaster, a regiment of New England fishermen transported the survivors across the East River to Manhattan. At a September 12, 1776, council of war, a grim-faced Washington asked his generals if he should abandon New York City to the enemy. Rhode Islander Nathanael Greene, Washington’s second-in-command, argued that “a general and speedy retreat is absolutely necessary” and insisted, as well, that “I would burn the city and suburbs,” which, he maintained, belonged largely to Loyalists. But Washington decided to leave the city unharmed when he decamped. Before he could do so, however, the British attacked again, at Kip’s Bay on the East River between present-day 30th and 34th Streets, two miles north of Hamilton’s hill fort, leaving his company cut off and in danger of capture. Washington sent Gen. Israel Putnam and his aide-de-camp, Maj. Aaron Burr, to evacuate them. The pair reached Fort Bunker Hill just as American militia from Lower Manhattan began to stream past Hamilton heading north on the Post Road (now Lexington Avenue). Although Hamilton had orders from Gen. Henry Knox to rally his men for a stand, Burr, in the name of Washington, countermanded Knox and led Hamilton, with little but the clothes on his back, two cannons and his men, by a concealed path up the west side of the island to freshly dug entrenchments at Harlem Heights. Burr most likely saved Hamilton’s life. The British built defenses across northern Manhattan, which they now occupied. On September 20, fanned by high winds, a fire broke out at midnight in a frame house along the waterfront near Whitehall Slip. Four hundred and ninety-three houses—one-fourth of the city’s buildings—were destroyed before British soldiers and sailors and townspeople put out the flames. Though the British accused Washington of setting the fire, no evidence has ever been found to link him to it. In a letter to his cousin Lund at Mount Vernon, Washington wrote: “Providence, or some good honest fellow, has done more for us than we were disposed to do for ourselves.” By mid-October, the American army had withdrawn across the Harlem River north to White Plains in Westchester County. There, on October 28, the British caught up with them. 
Behind hastily built earthworks, Hamilton’s artillerymen crouched tensely as Hessians unleashed a bayonet charge up a wooded slope. Hamilton’s gunners, flanked by Maryland and New York troops, repulsed the assault, causing heavy casualties, before being driven farther north. Cold weather pinched the toes and numbed the fingers of Hamilton’s soldiers as they dug embankments. His pay book indicates he was desperately trying to round up enough shoes for his barefoot, frostbitten men. Meanwhile, an expected British attack did not materialize. Instead, the redcoats and Hessians stormed the last American stronghold on Manhattan Island, Fort Washington, at present-day 181st Street, where 2,818 besieged Americans surrendered on November 16. Three days later, the British force crossed the Hudson and attacked Fort Lee on the New Jersey shore near the present-day George Washington Bridge. The Americans escaped, evacuating the fort so quickly they left behind 146 precious cannons, 2,800 muskets and 400,000 cartridges. In early November, Captain Hamilton and his men had been ordered up the Hudson River to Peekskill to join a column led by Lord Stirling. The combined forces crossed the Hudson to meet Washington and, as the commander in chief observed, his 3,400 “much broken and dispirited” men, in Hackensack, New Jersey. Hamilton hitched horses to his two remaining 6-pound guns and marched his gun crews 20 miles in one day to the Raritan River. Rattling through Elizabethtown, he passed the Elizabethtown Academy where, only three years earlier, his greatest concern had been Latin and Greek declensions. Dug in near Washington’s Hackensack headquarters on November 20, Hamilton was startled by the sudden appearance of his friend Hercules Mulligan, who, to Hamilton’s great dismay, had been captured some three months earlier at the Battle of Long Island. Mulligan had been determined a “gentleman” after his arrest and released on his honor not to leave New York City. After a joyous reunion, Hamilton evidently persuaded Mulligan to return to New York City and to act, as Mulligan later put it, as a “confidential correspondent of the commander-in-chief”—a spy. After pausing to await Gen. Sir William Howe, the British resumed their onslaught. On November 29, a force of about 4,000, double that of the Americans, arrived at a spot across the Raritan River from Washington’s encampment. While American troops tore up the planks of the New Bridge, Hamilton and his guns kept up a hail of grapeshot. For several hours, the slight, boyish-looking captain could be seen yelling, “Fire! Fire!” to his gun crews, racing home bags of grapeshot, then quickly repositioning the recoiling guns. Hamilton kept at it until Washington and his men were safely away toward Princeton. Halfway there, the general dispatched a brief message by express rider to Congress in Philadelphia: “The enemy appeared in several parties on the heights opposite Brunswick and were advancing in a large body toward the [Raritan] crossing place. We had a smart cannonade whilst we were parading our men.” Washington asked one of his aides to tell him which commander had halted his pursuers. 
The man replied that he had “noticed a youth, a mere stripling, small, slender, almost delicate in frame, marching, with a cocked hat pulled down over his eyes, apparently lost in thought, with his hand resting on a cannon, and every now and then patting it, as if it were a favorite horse or a pet plaything.” Washington’s stepgrandson George Washington Parke Custis later wrote that Washington was “charmed by the brilliant courage and admirable skill” of the then 21-year-old Hamilton, who led his company into Princeton the morning of December 2. Another of Washington’s officers noted that “it was a model of discipline; at their head was a boy, and I wondered at his youth, but what was my surprise when he was pointed out to me as that Hamilton of whom we had already heard so much.” After losing New Jersey to the British, Washington ordered his army into every boat and barge for 60 miles to cross the Delaware River into Pennsylvania’s Bucks County. A shivering Hamilton and his gunners made passage in a Durham ore boat, joining artillery already ranged along the western bank. Whenever British patrols ventured too near the water, Hamilton’s and the other artillerymen repulsed them with brisk fire. The weather grew steadily colder. General Howe said he found it “too severe to keep the field.” Returning to New York City with his redcoats, he left a brigade of Hessians to winter at Trenton. In command of the brigade, Howe placed Col. Johann Gottlieb Rall, whose troops had slaughtered retreating Americans on Long Island and at Fort Washington on Manhattan. His regiments had a reputation for plunder and worse. Reports that the Hessians had raped several women, including a 15-year-old girl, galvanized New Jersey farmers, who had been reluctant to help the American army. Now they formed militia bands to ambush Hessian patrols and British scouting parties around Trenton. “We have not slept one night in peace since we came to this place,” one Hessian officer moaned. Washington now faced a vexing problem: the enlistments of his 3,400 Continental troops expired at midnight New Year’s Eve; he decided to attack the Trenton Hessians while they slept off the effects of their Christmas celebration. After so many setbacks, it was a risky gambit; defeat could mean the end of the American cause. But a victory, even over a small outpost, might inspire lagging Patriots, cow Loyalists, encourage reenlistments and drive back the British—in short, keep the Revolution alive. The main assault force was made up of tested veterans. Henry Knox, Nathanael Greene, James Monroe, John Sullivan and Alexander Hamilton, future leaders of America’s republic, huddled around a campfire at McKonkey’s Ferry the frigid afternoon of December 25, 1776, to get their orders. Hamilton and his men had blankets wrapped around them as they hefted two 6-pounders and their cases of shot and shells onto the 9-foot-wide, 60-foot-long Durham iron-ore barges they had commandeered, then pushed and pulled their horses aboard. Nineteen-year-old James Wilkinson noted in his journal that footprints down to the river were “tinged here and there with blood from the feet of the men who wore broken shoes.” Ship captain John Glover ordered the first boatloads to push off at 2 a.m. Snow and sleet stung Hamilton’s eyes. Tramping past darkened farmhouses for 12 miles, Hamilton’s company led Nathanael Greene’s division as it swung off to the east to skirt the town. One mile north of Trenton, Greene halted the column. 
At precisely 8 in the morning, Hamilton unleashed his artillery on the Hessian outpost. Three minutes later, American infantry poured into town. Driving back Hessian pickets with their bayonets, they charged into the old British barracks to confront groggy Hessians at gunpoint. Some attempted to regroup and counterattack, but Hamilton and his guns were waiting for them. Firing in tandem, Hamilton’s cannons cut down the Hessians with murderous sheets of grapeshot. The mercenaries sought cover behind houses but were driven back by Virginia riflemen, who stormed into the houses and fired down from upstairs windows. Hessian artillerymen managed to get off only 13 rounds from two brass fieldpieces before Hamilton’s gunners cut them in two. Riding back and forth behind the guns, Washington saw for himself the brutal courage and skillful discipline of this youthful artillery captain. The Hessians’ two best regiments surrendered, but a third escaped. As the Americans recrossed the Delaware, both they and their prisoners, nearly 1,000 in all, had to stomp their feet to break up the ice that was forming on the river. Five men froze to death. Stung by the defeat, British field commander Lord Cornwallis raced across New Jersey with battle-seasoned grenadiers to retaliate. Americans with $10 gold reenlistment bonuses in their pockets recrossed the river to intercept them. When the British halted along a three-mile stretch of Assunpink Creek outside Trenton and across from the Americans, Washington duped British pickets by ordering a rear guard to tend roaring campfires and to dig noisily through the night while his main force slipped away. At 1 a.m., January 2, 1777, their numbers reduced from 69 to 25 by death, desertion and expired enlistments, Hamilton and his men wrapped rags around the wheels of their cannons to muffle noise, and headed north. They reached the south end of Princeton at sunrise, to face a brigade—some 700 men—of British light infantry. As the two forces raced for high ground, American general Hugh Mercer fell with seven bayonet wounds. The Americans retreated from a British bayonet charge. Then Washington himself galloped onto the battlefield with a division of Pennsylvania militia, surrounding the now outnumbered British. Some 200 redcoats ran to Nassau Hall, the main building at Princeton College. By the time Hamilton set up his two cannons, the British had begun firing from the windows of the red sandstone edifice. College tradition holds that one of Hamilton’s 6-pound balls shattered a window, flew through the chapel and beheaded a portrait of King George II. Under Hamilton’s fierce cannonade, the British soon surrendered. In the wake of twin victories within ten days, at Trenton and Princeton, militia volunteers swarmed to the American standard, far more than could be fed, clothed or armed. Washington’s shorthanded staff was ill-equipped to coordinate logistics. In the four months since the British onslaught had begun, 300 American officers had been killed or captured. “At present,” Washington complained, “my time is so taken up at my desk that I am obliged to neglect many other essential parts of my duty. It is absolutely necessary for me to have persons [who] can think for me as well as execute orders. . . . As to military knowledge, I do not expect to find gentlemen much skilled in it. If they can write a good letter, write quick, are methodical and diligent, it is all I expect to find in my aides.” He would get all that and more. 
In January, shortly after the army was led into winter quarters at Morristown, New Jersey, Nathanael Greene invited Hamilton, who had just turned 22, to dinner at Washington’s headquarters. There, Washington invited the young artillery officer to join his staff. The appointment carried a promotion from captain to lieutenant colonel, and this time Hamilton did not hesitate. On March 1, 1777, he turned over the command of his artillery company to Lt. Thomas Thompson—a sergeant whom, against all precedent, he had promoted to officer rank—and joined Washington’s headquarters staff. It would prove a profound relationship. “During a long series of years, in war and in peace, Washington enjoyed the advantages of Hamilton’s eminent talents, integrity and felicity, and these qualities fixed [Hamilton] in [Washington’s] confidence to the last hour of his life,” wrote Massachusetts Senator Timothy Pickering in 1804. Hamilton, the impecunious abandoned son, and Washington, the patriarch without a son, had begun a mutually dependent relationship that would endure for nearly 25 years—years corresponding to the birth, adolescence and coming to maturity of the United States of America. Hamilton would become inspector general of the U.S. Army and in that capacity founded the U.S. Navy. Along with James Madison and John Jay, he wrote the Federalist Papers, essays that helped gain popular support for the then-proposed Constitution. In 1789, he became the first Secretary of the Treasury, under President Washington, and almost single-handedly created the U.S. Mint, the stock and bond markets and the concept of the modern corporation. After the death of Washington on December 14, 1799, Hamilton worked secretly, though assiduously, to prevent the reelection of John Adams as well as the election of Thomas Jefferson and Aaron Burr. Burr obtained a copy of a Hamilton letter that branded Adams an “eccentric” lacking in “sound judgment” and got it published in newspapers all over America. In the election of 1800, Jefferson and Burr tied in the Electoral College, and Congress made Jefferson president, with Burr his vice president. Hamilton, his political career in tatters, founded the New York Evening Post newspaper, which he used to attack the new administration. In the 1804 New York gubernatorial election, Hamilton opposed Aaron Burr’s bid to replace Governor George Clinton. With Hamilton’s help, Burr was defeated. When he heard that Hamilton had called him “a dangerous man, and one who ought not to be trusted with the reins of government,” Burr demanded a written apology or satisfaction in a duel. On the morning of Wednesday, July 11, 1804, on a cliff in Weehawken, New Jersey, Hamilton faced the man who had rescued him 28 years earlier in Manhattan. Hamilton told his second, Nathaniel Pendleton, that he intended to fire into the air so as to end the affair with honor but without bloodshed. Burr made no such promise. A shot rang out. Burr’s bullet struck Hamilton in the right side, tearing through his liver. Hamilton’s pistol went off a split second later, snapping a twig overhead. Thirty-six hours later, Alexander Hamilton was dead. He was 49 years old.
268c93199080c58006659a91c60de66e
https://www.smithsonianmag.com/history/heartbreaking-history-of-divorce-180949439/
The Heartbreaking History of Divorce
The Heartbreaking History of Divorce Each Valentine’s Day, I start off feeling happy. My contentment grows as my husband and I put our five children to bed and we enjoy a quiet dinner in the kitchen. I’m still happy when we plop ourselves onto the sofa for an hour of television before bedtime. But then my mood changes and I can’t help thinking about divorce. I don’t mean for me. It’s the shows we watch. The romantic twists and miserable turns of the characters; their many heartbreaks and only occasional highs reflect a deeper truth about modern life. The fact is, in the United States the probability of a first marriage lasting for 20 years has decreased to about 50-50. (Before anyone blames Western decadence for the breakdown of the family, it should be pointed out that the Maldives occupies the number one spot in the divorce league tables, followed by Belarus. The United States is third.) Furthermore, these grim statistics don’t even touch on the reality that for an increasing percentage of the population, life is a series of short cohabitations punctuated by the arrival of children. For a country that makes such a fuss about love on the 14th of February, America has a funny way of showing it on the other 364 days of the year. This may be my XX chromosomes doing the talking, but it seems to me that divorce is, and always has been, a women’s issue par excellence. Multiple studies have shown that women bear the brunt of the social and economic burdens that come with divorce. The quickest route to poverty is to become a single mother. This is awful enough, but what I find so galling is that the right to divorce was meant to be a cornerstone of liberty for women. For centuries, divorce in the West was a male tool of control—a legislative chastity belt designed to ensure that a wife had one master, while a husband could enjoy many mistresses. It is as though, having denied women their cake for so long, the makers have no wish to see them enjoy it. There is no point trying to pin down where things went wrong for women because, when it comes to divorce, it’s not clear that things were ever right. Still, that shouldn’t prevent us from exploring how the modern concept of a legal divorce came into being, or from dismantling many of the myths that surround the history of divorce. The most celebrated divorce case in history remains that of Henry VIII versus Pope Clement VII. The battle began in 1527, when Henry tried to force the pope into annulling his marriage to Catherine of Aragon, who had failed to provide him with a male heir. Determined to make the younger and prettier Anne Boleyn his wife, Henry finally broke with Rome in 1533 and declared himself the head of a new church, the Church of England. The collateral damage from Henry’s unilateral decision was a way of life that stretched back for more than a thousand years. Gone forever was not just a system of patronage or the ancient rites, but the vast network of religious schools, hospitals, convents and monasteries that maintained the social fabric of the country. If Helen’s face is said to have launched a thousand ships, then Anne’s closed a thousand churches. Yet her ascendancy over Henry did not survive the stillbirth of a male heir. A mere three years after the controversial marriage, Anne was convicted of treason, adultery and incest, and beheaded. 
Her enemies were legion by the time of her death, and even today some still regard her as the original home-wrecker, the woman whose unbridled social ambition destroyed the sanctity of marriage. It is generally assumed that she caused the floodgates of divorce to be opened in England, never to be closed again. As with most assumptions, appearances can be deceiving. Henry’s marriage to Anne led to precisely one divorce—in 1552. The term was not even used again until 1670. In fact, while Protestant Europe was beginning to embrace the idea that there could indeed be justifiable reasons for ending a marriage, England actually made a lurch backward. Not only did Henry VIII’s new church come out against divorce under any circumstances, but it also far outstripped Catholic Europe in the restrictions on the granting of annulments. The liberal consanguinity rules of cousinhood, for example, which allowed even distantly related couples to part, were scrapped entirely. The Church of England’s resistance to divorce was so strong that the only route to a divorce was via an act of Parliament—a law voted through by both houses. Not surprisingly, few people had the means or inclination to expose their private unhappiness to the press, the public and 800-odd politicians. When a divorce law was finally enacted in 1857, and the “floodgates” were opened, the number of divorces in English history stood at a mere 324. Only four of the 324 cases were brought by women. A husband needed to prove adultery to obtain a divorce. By contrast, a wife was required to prove adultery and some other especially aggravating circumstance to have the same grounds. Over the years, women learned that brutality, rape, desertion and financial chicanery did not count. In fact, Parliament seemed hard pressed to say what did, until Jane Addison launched her case in 1801. She won on the basis of Mr. Addison’s adultery and incest with her sister in the marital home. Before Mrs. Addison’s successful suit, the best a woman could hope for was a legal separation. Such arrangements were under the jurisdiction of the church courts. Litigants of either sex could sue for separation on the basis of life-threatening cruelty or adultery. Women who obtained a divortium a mensa et thoro (separation from bed and board) could live apart from their husbands, often on an allowance fixed by the court. The process was expensive and tortuous—hence there were only a few dozen cases a year—and at the end, no matter what the grounds for the separation, a wife was still required to be chaste and obedient to her husband. Unless there were truly extenuating circumstances, she could expect to lose custody of her children, too. The paucity of options available to women did not mean that they simply stopped trying. The grounds for annulment included inability to consummate the marriage. The sheer ordeal of providing proof—the wife was always subjected to physical examinations of the most intrusive kind—was enough to deter most women. But in 1561, Willmott Bury of Devon requested an annulment on the grounds that her husband, John, was physically incapable of consummating the marriage. The examining midwives agreed that Mrs. Bury was a virgin, and a physician testified that a kick from a horse had left Mr. Bury with just one testicle, the size of a tiny bean. The court duly granted an annulment. Unfortunately, on his release from Willmott, John married again and fathered a son. 
Matters came to a head when the next in line to inherit Bury’s estate challenged the validity of the annulment, and tried to have the son proclaimed illegitimate. The suit ultimately failed. The embarrassment caused by the Bury case led to a far stricter interpretation of the rules, including the new stipulation that if an ex-husband suddenly “found” his potency, the annulment became invalid. Nevertheless, in 1613, Frances, Countess of Essex, and her family cited impotency in their nullity suit against the Earl of Essex. As the countess’ father put it, “the Earl had no ink in his pen.” Essex did not dispute the fact that the marriage had never been consummated. But, eager to avoid dishonor and humiliation, he claimed that the difficulty was only with Frances. Aristocratic society did not know what to make of the case. Meanwhile, Frances had fallen in love with King James I’s favorite courtier, the Earl of Somerset. She was desperate to marry him, and prepared to do anything to win her case—a dangerous state of affairs that would come back to haunt her. Frances’ lawyers believed they had found a solution in the form of an obscure pronouncement by the 13th-century saint Thomas Aquinas. According to Aquinas, a man could be rendered temporarily impotent if witchcraft were involved. The Earl of Essex, claimed Frances’ lawyers, had been the victim of malevolence by a person or persons unknown. An annulment was therefore possible with all honor intact. Few people were taken in by the Aquinas argument, and certainly not the Archbishop of Canterbury, who headed the panel of ten judges. But Frances and Somerset had a powerful ally in the form of the king. The suit was granted by a majority vote, and the couple were married in December 1613 in the society wedding of the year. This was not the end of the story, however. Two years later, the king received a letter that he could not ignore. It accused Frances of having poisoned Sir Thomas Overbury, one of the loudest critics against the annulment, who conveniently died just ten days before the court decision. If that were not damaging enough, Overbury had died while a prisoner in the Tower of London—sent there on the orders of the king. Behind the obvious scandal lay a possible conspiracy that reached all the way to the throne. Suspects were rounded up with bewildering speed. Frances was arrested and pleaded guilty to attempted murder. The disgraced couple was permanently banished to the country, where they lived out their days in bitterness and mutual recrimination. The Essex affair had a dampening effect on annulment suits. Subsequent litigants invariably failed unless they had an incontrovertible case involving, for example, two women and a deception, such as the 1680 suit of Arabella Hunt, who thought she married “James Howard” only to discover “he” was a woman named Amy Poulter. A woman married to a castrato could also claim valid grounds, as in the doomed 1766 love affair between Dorothea Maunsell and the Italian opera singer Giusto Ferdinando Tenducci. This left two grounds open to women: bigamy and being underage at the time of the marriage. Both were easy to prove and surprisingly common until the 1753 Marriage Act established a set of rules for the performing and recording of marriages. Before then, a woman married to a scoundrel could only hope that he had a secret marriage somewhere in his past. 
In 1707, Barbara Villiers, one of Charles II’s favorite mistresses, was rescued from years of misery after she discovered that her husband of two years was already married. Barbara had been long pensioned off with a handsome allowance and the title of Duchess of Cleveland when, at the age of 64, she fell for a man ten years younger named Robert “Beau” Fielding. She married him on November 25, 1705, despite his reputation as one of London’s worst rakes. But what Barbara did not know was that two weeks earlier, Fielding had married Anne Deleau, a widow with a fortune of £60,000. Fielding kept the deception going for six months until he discovered that an even greater deception had been practiced on him. “Anne Deleau” was actually Mary Wadsworth, a friend of the real Anne Deleau’s hairdresser. Fielding turned his rage on the Duchess of Cleveland, beating her so badly that she jumped through a window to escape his violence. She brought a successful suit against him in December, by which time he had already run through a great deal of her money and seduced her granddaughter, leaving her pregnant with his son. Since the hideous violence Fielding inflicted on Barbara would not, in itself, have been sufficient to secure a divorce, it raises the question whether there was ever a case so extreme that the courts intervened. The answer is just once, but not in the manner traditionally associated with divorce. In April 1631, a grand jury indicted the Earl of Castlehaven on the capital charges of rape and sodomy. The list of his alleged crimes included hiring his male lovers as his servants and giving them full control of the household, marrying off his eldest daughter to one of his lover/servants, colluding in the seduction of his adolescent stepdaughter, and finally, holding down his wife while she was raped by one of his servants. Castlehaven’s chief defense was that a wife’s body belonged to her husband, to dispose of as he saw fit. According to English law, the prosecutors could not disagree with the first part of his statement, but they rejected the logical conclusion of the latter. The earl was sentenced to death. Castlehaven was beheaded on May 14, 1631, almost exactly 100 years after the execution of Anne Boleyn. The irony was that in both cases, death had been easier to achieve than divorce. Contrary to popular belief, Henry VIII did not divorce any of his wives. He had sought an annulment from Catherine of Aragon—which he finally awarded to himself after the pope’s continued refusal. When it came to Anne’s turn, Henry took the easy route by having her found guilty of treason. Two days before her execution he became anxious and ordered his bishops to decree an annulment as well. Henry did not like to think of himself as a wife killer. If Anne Boleyn was guilty of starting any sort of trend, it was in adding new significance to the line “till death do you part.” Amanda Foreman is the award-winning author of Georgiana: Duchess of Devonshire and A World on Fire: Britain's Crucial Role in the American Civil War. Her next book The World Made by Women: A History of Women from the Age of Cleopatra to the Era of Thatcher, is slated for publication by Random House (US) and Allen Lane (UK) in 2015.
3a350e93c1716c14ee594bd3a7af4429
https://www.smithsonianmag.com/history/hidden-history-anna-murray-douglass-180968324/
The Hidden History of Anna Murray Douglass
The Hidden History of Anna Murray Douglass “The story of Frederick Douglass’ hopes and aspirations and longing desire for freedom has been told—you all know it. It was a story made possible by the unswerving loyalty of Anna Murray.” So began Rosetta Douglass Sprague, daughter of Anna and Frederick Douglass, in a speech delivered in 1900 that later became the book My Mother As I Recall Her. It remains one of the few works that focuses on Anna Murray Douglass, in contrast to the hundreds that have been written on Frederick Douglass and his legacy. That neglect is in part due to the paucity of materials available on Anna; she was largely illiterate and left behind few physical traces of her life, whereas Frederick wrote thousands of letters and multiple books. But without Anna, Frederick may never have achieved such fame for his abolitionism—or even escaped slavery. Frederick and Anna met in 1838, when he still went by the surname Bailey and she by Murray. The daughter of enslaved parents in rural Maryland, Anna was born around 1813, the first of her siblings to be born free after her parents were manumitted. She lived with her parents until the age of 17, at which point she headed for Baltimore and found work as a domestic helper. Over the years she managed to earn and save money; the vibrant community of more than 17,000 free blacks in the Maryland city organized black churches and schools despite repressive laws restricting their freedoms. When she met Frederick—historians disagree on when and where their acquaintance occurred, but it may have been while attending the same church—she was financially prepared to start a life with him. But first, he needed freedom. By borrowing a freedman’s protection certificate from a friend and wearing a sailor’s disguise sewn by Anna, Frederick made his way to New York City by train (possibly spending Anna’s money to buy the ticket, says historian Leigh Fought). Once there, he sent for Anna and they were married in the home of abolitionist David Ruggles. According to Rosetta, Anna brought nearly everything the couple needed to begin their life together: a feather bed with pillows and linens; dishes with cutlery; and a full trunk of clothing for herself. “It was a leap of faith on her part, but there’s not many free black men to marry, and even that could be precarious,” says Fought, the author of Women in the World of Frederick Douglass and professor of history at Le Moyne College. “If she marries Frederick and goes north, she might be working, but she’s got a husband who’s free and in the North there are schools and their children can be educated.” The two settled into a small home in New Bedford, Massachusetts, and both continued working at menial tasks or housekeeping until Anna began having children. The first four—Rosetta, Lewis, Charles and Frederick Jr.—were all born in New Bedford. Meanwhile, Frederick was becoming ever more involved in the abolition movement, and before long, he was traveling extensively to give speeches—including a two-year stint in England from 1845 to 1847—with Anna left alone to raise and support the family. During that time, she managed to save everything he sent back and used only her own income from mending shoes to support the family. Having the wife act as the family financial planner was common for the period, Fought says. 
“Within working class households there’s going to be more egalitarian management of the money, and women kept the household books.” This was especially important for the Douglass family, since Frederick was away from home so frequently. Upon Frederick’s return from England in 1847, he moved the family from Massachusetts to Rochester, New York, where they would play host to innumerable guests involved in the anti-slavery movement, and hide runaways on the Underground Railroad. Frederick also began publication of The North Star, an anti-slavery newspaper. But Frederick’s increasing fame and visibility came with difficulties for Anna beyond the danger inherent in operating a stop on the Railroad and having a husband who drew the ire of slavers. In addition to the hidden guests, the Douglass home also played host to a number of Frederick’s colleagues, including two white European women. Julia Griffiths, an English woman who helped with The North Star, lived in the Douglass household for two years, occasionally commenting on the lowly nature of Anna’s work. “Poor fellow!” she wrote in one letter in reference to Frederick. “The quiet & repose he so much needs are very difficult for him to attain in his domestic circle.” Another houseguest, the German Ottilie Assing, had numerous unkind things to say of Anna. Frederick’s close affiliation with both these women only added fuel to the fire of rumormongering that followed the family. He was accused of having affairs with both, in part to discredit his work as an abolitionist and in part because of stereotypes of the day about the infidelity of African-American men. For Anna to defend herself would’ve required abandoning the privacy of their home life that was such a privilege for an African-American woman of the era. “Frederick is very circumspect about mentioning Anna [in his writing] because he’s trying to respect her,” Fought says. “Women weren’t supposed to appear in print. You appeared in print when you got married and when you died. Something had gone wrong in your life if you appeared in print at other times.” To respond publicly to rumors about her husband would send Anna down a road she didn’t want to be on, Fought explains, and chip away at her respectability. For Rose O’Keefe, author of Frederick & Anna Douglass in Rochester, NY, Anna doesn’t get the credit she deserves. “They say she held the household together, but there was so much more to it than that,” O’Keefe says. Anna would’ve been working constantly to manage the guests, keep the house clean, tend the garden, balance the varying opinions of her husband’s colleagues without getting caught in the middle, and keep their work on the Underground Railroad secret. “It was a tough role, a very tough role.” And there were plenty of personal low points in her life as well. Frederick was forced to flee the country in 1859 after John Brown’s Harpers Ferry raid to avoid being arrested under the charge that he’d assisted in the attack (though he hadn’t). The couple’s youngest daughter, Annie, died in 1860 at age 10, and the family home in Rochester was burned down (likely due to arson) in 1872. The Douglasses lost over $4,000 worth of goods in the fire, as well as the only complete set of The North Star and Frederick’s later news publications. After the fire, Anna and Frederick moved to Washington, D.C. While Frederick continued his work, Anna continued managing the home, now with occasional help from Rosetta, as well as numerous relatives and grandchildren. 
She died in 1882 after a series of strokes, leaving behind a legacy that few people ever thought to explore. “People judge Anna to not be good enough for their great, darling Douglass,” Fought says. “Some of it is racially prejudiced because she’s darker skinned. They don’t believe she’s pretty enough.” But even though she left only the slightest mark on the written record of the past, Fought argues that there are still ways to understand some of what her life was like and who she was. “[People like Anna] did leave an impression on the historical record by doing things. You have to be quiet and listen to the choice they made and understand the context and the other possible choices they had,” Fought says. “In that empathy, we understand more about their lives. Often you don’t get them, but you get the outlines of where they were, and an idea of what going through their life would’ve been like.” For Anna, it was a life of working in the background and often being held to unfair standards. But it was also a life of freedom, and numerous children who had the advantage of an education, and who continued coming to her for advice and solace until the end of her life. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
e10dde623c3718dc3271e14fa1549a41
https://www.smithsonianmag.com/history/hiroshima-usa-169079615/
Hiroshima, U.S.A.
Hiroshima, U.S.A. There’s no city that Americans fictionally destroy more often than New York. New York has been blown up, beaten down and attacked in every medium imaginable throughout the 19th and 20th centuries. From movies to novels to newspapers, there’s just something so terribly apocalyptic in the American psyche that we must see our most populous city’s demise over and over again. Before WWII, these visions of New York’s destruction took the form of tidal waves, fires or giant ape attacks — but after the United States dropped two atomic bombs on Japan in Hiroshima and Nagasaki, the atom was suddenly the new leveler of cities. The August 5, 1950 cover of Collier’s magazine ran an illustration of a mushroom cloud over Manhattan, with the headline reading: “Hiroshima, U.S.A.: Can Anything be Done About It?” Written by John Lear, with paintings by Chesley Bonestell and Birney Lettick, the article obliterates New York through horrifying words and pictures. The first page of the article explains “the story of this story”: For five years now the world has lived with the dreadful knowledge that atomic warfare is possible. Since last September, when the President announced publicly that the Russians too had produced an atomic explosion, this nation has lived face to face with the terrifying realization that an attack with atomic weapons could be made against us. But, until now, no responsible voice has evaluated the problem constructively, in words everybody can understand. This article performs that service. Collier’s gives it more than customary space in the conviction that, when the danger is delineated and the means to combat it effectively is made clear, democracy will have an infinitely stronger chance to survive. The illustrator who painted the cover was Chesley Bonestell, and it is no doubt one of the most frightening images ever to grace the cover of a major American magazine. Opening up to the story inside, we see a city aflame. A kind of wire service ticker tape runs across the top of the images inside the magazine: BULLETIN NOTE TO EDITORS — ADVISORY ONLY — NEWARK NJ — HUGE EXPLOSION REPORTED IN LOWER NEW YORK CITY. IMMEDIATE CONFIRMATION UNAVAILABLE. WIRE CONNECTIONS WITH MANHATTAN ARE DOWN. NEW YORK HAS ADVISED IT WILL FILE FROM HERE SHORTLY . . . BULLETIN — HOBOKEN NJ — DOCK WORKERS ON THE NEW JERSEY SIDE OF THE HUDSON RIVER THIS AFTERNOON REPORTED A THUNDEROUS EXPLOSION IN THE DIRECTION OF NEW YORK CITY. THEY SAID THEY SAW A TREMENDOUS BALL OF FIRE RISING INTO THE SKY The first few pages of the article tell the story of a typical Tuesday in New York City, with people going about their business. Suddenly a radiant heat is felt and a great flash engulfs the city. People in Coney Island mistake it for a lightning bolt. A housewife in the Bronx goes to the kitchen window to investigate where the light came from, only to have the window smash in front of her, sending thousands of “slashing bits” toward her body. As Lear describes it, it doesn’t take long for “millions of people, scattered over thousands of miles” to discover what has taken place. The aftermath is one of great panic with emergency vehicles unable to move and people rushing to find transportation. Collier’s would touch on this theme of urban panic a few years later in their August 21, 1953 issue. One of the many fictional characters we follow in this story (an Associated Press reporter named John McKee) somehow manages to hail a cab in all this madness. 
McKee eventually gets to his office and begins reading the bulletins: (NR) New York — (AP) — An A-bomb fell on the lower East Side of Manhattan Island at 5:13 P.M. (edt) today — across the East River from the Brooklyn Navy Yard. The story goes on to describe how news coverage is largely crippled by the fact that 16 telephone exchanges were out, leaving 200,000 telephones useless. Ham radios, naturally, come to the rescue in their ability to spread emergency messages. The cover ran almost five years to the day after the U.S. bombing of Hiroshima on August 6, 1945. The military was able to go in after the attack and measure the extent of the devastation. The graphs that ran with the Collier’s article explain what kind of impact would be felt at various distances from ground zero. The article explained that our understanding of what a nuclear attack on New York would look like came straight from U.S. measurements in Japan: The opening account of an A-bombing of Manhattan Island may seem highly imaginative. Actually, little of it is invention. Incidents are related in circumstances identical with or extremely close to those which really happened elsewhere in World War II. Property damage is described as it occurred in Hiroshima and Nagasaki, with allowance for differences between Oriental and Occidental standards of building. Death and injury were computed by correlating Census Bureau figures on population of particular sections of New York with Atomic Energy Commission and U.S. Strategic Bombing Survey data on the two A-bombs that fell on Japan. Every place and name used is real. This Collier’s article wasn’t the first to warn of the devastating effect an atomic bomb could have on New York. A four-part series that ran in newspapers across the country in April of 1948 also described how awful a nuclear attack on New York could be. Written by S. Burton Heath, the first article in the series ran with the headline, “One A-Bomb Dropped In New York Would Take 800,000 Lives.” One atomic bomb, exploded over New York’s Times Square on a working day, could be expected to kill several hundred thousand men, women and children. No reputable atomic expert, in Washington or elsewhere, will estimate the exact number. The New York fire department says 100,000. On the basis of Hiroshima and Nagasaki it would be more than 800,000. The most reliable experts say the fire department’s guess is absurdly low. They think the bigger figure is too high. After the surreal devastation that we witnessed during the terrorist attacks upon New York on September 11, 2001, we have some idea of what true horror looks like when inflicted upon a major American city. But a nuclear bomb is still something altogether different. The level of destruction that would result from nuclear warfare remains an abstraction for many — until you flip through old magazines of the Cold War. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
258c4513aef7e80d5df57efd79dacee9
https://www.smithsonianmag.com/history/his-patriotic-birthday-five-facts-about-calvin-coolidge-180969516/
For His Patriotic Birthday, Five Facts About Calvin Coolidge
For His Patriotic Birthday, Five Facts About Calvin Coolidge This week the skies will erupt with fireworks honoring the anniversary of our nation’s independence. But a few lone sparklers may flicker for another patriotic cause: the birth of President Calvin Coolidge on July 4, 1872. Silent Cal was best known for his brevity. Though perhaps apocryphal, one particularly infamous incident recounts a White House dinner guest smugly informing Coolidge she’d made a bet that she could get more than two words out of him. His single retort? “You lose.” Even his parting gift to the world was verbally frugal: a last will and testament comprising only 23 words. What Coolidge lacked in words, however, he made up for in many other ways. Here are five things you may not have known about our 30th president: Raised on a secluded farm in Plymouth Notch, Vermont, Coolidge took comfort in being surrounded by wildlife. He and his wife, Grace, owned pets both wild and domesticated: dogs, cats, birds and raccoons were among the many species that overran the White House during Coolidge’s tenure, terrorizing milkmen and baffling the Secret Service. Just before Coolidge’s inauguration in 1925, Edmund Starling, Coolidge’s Secret Service Chief, moseyed down into the basement to find his new charge trying to stuff a black cat into a crate with a rooster. Coolidge was pithy as ever: he just wanted to see “what would happen.” Once the public got wind that their president had a penchant for the furry and feathered, stranger and stranger packages began to arrive at Pennsylvania Avenue. “He was just flooded with animals,” says David Pietrusza, historian and author of Calvin Coolidge: A Documentary Biography. Some of these gifts were simply too wild for the Oval Office, though, and had to be transferred to the Smithsonian’s National Zoo. Among them were a pygmy hippo named Billy, a wallaby, and two lion cubs that Coolidge cheekily named “Tax Reduction” and “Budget Bureau.” But perhaps the oddest pet repurposing happened in November of 1926. Hoping to win Coolidge over, a cohort of well-intentioned admirers shipped him a live raccoon with the intent of having it roasted as the centerpiece of his Thanksgiving dinner. But the Coolidges, finding the raccoon sweet and friendly, couldn’t bear to see her killed—and so it was out of the frying pan and into the arms of the First Lady. Just a few short weeks later, the newest member of the Coolidge household got gussied up for Christmas, adorned in a red ribbon. Among the presents piled high by the Christmas tree was a shiny new collar, bearing the title “Rebecca Raccoon of the White House.” Like many other Coolidge pets, Rebecca was spoiled rotten. While she had likely dined in dumpsters before her relocation to Washington, D.C., Rebecca’s diet in the White House consisted of chicken, eggs, green shrimp, persimmons and cream. According to Amity Shlaes, author of Coolidge, Rebecca was often toted around in her own basket by Grace, making public appearances at summer parties and Easter egg rolls. Just as often, Rebecca could be found draped around Coolidge’s neck like a masked scarf as he went about his daily duties. Eventually Rebecca got too unruly even for the Coolidges. After she made several botched escape attempts, they reluctantly moved her to the National Zoo. Fearing she might be lonely in her new home, Coolidge and Grace even found her a male companion named Reuben—but their blind date was, alas, ill-fated, and Reuben eventually fled the zoo. 
Coolidge was deeply attached to his four-legged companions. When Rob Roy, a favorite collie, fell ill in 1928, Coolidge rushed him not to a veterinarian, but to Walter Reed Army Hospital for surgery. Sadly, Rob Roy didn’t survive the night. In an outpouring of emotion that, for Silent Cal, was downright “gushy,” according to Pietrusza, Coolidge wrote of the dog in his autobiography: “He was a stately companion of great courage and fidelity… I know he would bark for joy as the grim boatman ferried him across the dark waters of the Styx, yet his going left me lonely on the hither shore.” In his youth, Coolidge was an enthusiastic horseman, and he carried his passion for riding to the White House—only to be pulled to a halt by Secret Service agents who insisted horseback riding was too dangerous an activity for a president. Coolidge was understandably grumpy about swearing off one of his favorite hobbies, and he complained loudly enough that New York banker Dwight W. Morrow decided to send him a mechanical hobby horse—because as luck would have it, Morrow had an extra one just lying around. Instead of looking his 475-pound gift horse in the mouth, Coolidge rigged it up and resumed riding, this time from the comfort of the White House. Thunderbolt, as the mechanical horse was nicknamed, was one of many health-conscious inventions by John Harvey Kellogg, whose legacy has filled the cereal bowls of countless early risers (Kellogg apparently also perfected a mechanical camel, which rocked side-to-side as opposed to the back-and-forth of his iron equine). While Thunderbolt was deemed a positive force on Coolidge’s liver and weight management, the horse quickly began to wound his pride: Coolidge was mocked in and out of the White House for his emasculating “hobby horse.” Eventually, whether to preserve his waning machismo or simply out of boredom, Coolidge retired Thunderbolt, who is now immortalized at the Calvin Coolidge Presidential Library and Museum in Northampton, Massachusetts. While visitors to the museum are strictly forbidden from mounting the weary steed, Pietrusza admits he may or may not have furtively snapped a photo of himself atop Thunderbolt on one of his visits. Calvin Coolidge and Grace Goodhue were, by all accounts, happily married. The pairing was unlikely: stoic Coolidge courting the outgoing, vivacious Grace was a surprise to many, says Pietrusza. In their case, opposites certainly attracted. The first time Grace saw Coolidge, she glimpsed him shaving through the window of a boarding house bathroom in Northampton. Coolidge was wearing nothing but underwear and one of his signature derby hats—but rather than being appalled at the indecency, Grace just burst out laughing. “He wasn’t angry, though,” Pietrusza says. “He looked over and just thought, ‘I like her.’” On his many walks, Coolidge would frequent the storefronts of downtown Washington. While frugal in many other respects, if he saw a dress or hat he thought Grace might like, he almost always had it promptly packaged and sent to the White House. Reflecting on their marriage later in life, Coolidge once said, “We thought we were made for each other. For almost a quarter of a century she has borne with my infirmities, and I have rejoiced in her graces.” Their relationship was, of course, not without its hiccups. According to Pietrusza, Coolidge could be fiercely protective of Grace. In her most famous portrait, Grace was painted posing with the Coolidges’ collie, Rob Roy. 
Envisioning the portrait rendered in a patriotic color palette, Grace donned a red dress to contrast with the dog’s white coat, which the Coolidges had reportedly bleached to hide a few of Rob Roy’s off-color spots. But bright colors were considered a bold fashion statement at the time, and Coolidge wryly remarked that Grace could achieve the same striking effects by simply dyeing the dog red and wearing white instead. Coolidge died suddenly of complications from heart disease in 1933, after returning from a half-day at work. Grace was the first to find him on the floor of the bathroom in their home in Northampton, midway through shaving his face. But the person for whom Coolidge’s patience most often ran short was his son John, with whom he could be stern. In 1924, while John was attending Amherst College, Coolidge wrote a letter berating John for failing to take his studies seriously. “I want you to keep in mind that you have been sent to college to work,” Coolidge wrote. “Nothing else will do you any good. Nobody in my class who spent their time in other ways has ever amounted to anything. Unless you want to spend your time working you may just as well leave college.” Coolidge then more or less told John that, when it came to social engagements, he needed to know exactly what his son was doing and who he was doing it with. “He took the hide off John,” Pietrusza says. In 1926, Coolidge indefinitely installed a Secret Service agent as a 24/7 bodyguard for John. The Coolidges had received a series of threatening letters, so the president was likely concerned for his son’s safety—but perhaps cloaked beneath the veneer of security was Coolidge’s underlying exasperation with his son’s continued shenanigans. From that point on, the agent shared quarters with John near the Amherst campus and rarely strayed from his side. While John was able to attend classes and occasionally spend time with friends outside of his company, the agent also became a bit of a personal life coach, advising the president’s son on matters from his health to the quality of his companions. One of Coolidge’s quieter legacies was the Indian Citizenship Act of 1924, wherein all Native Americans were granted U.S. citizenship. This step by Coolidge was one of several that reflected his interest in advancing civil rights. Over the course of his tenure in the White House, he fought (unsuccessfully) to make lynching a federal crime. “He had a certain magnanimity,” says Shlaes. “He understood that [diversity] brought a lot to the table.” In 1924, an angry voter wrote Coolidge complaining that a black man was attempting to run for Congress. Coolidge was so appalled by the letter that he published his own indignant reply: “During the war 500,000 colored men and boys were called up under the draft, not one of whom sought to evade it. A colored man is precisely as much entitled to submit his candidacy [as any other citizen].” Coolidge himself put it best in his address before the American Legion Convention at Omaha, Nebraska, in October of 1925: “Whether one traces his Americanism back three centuries to the Mayflower, or three years to the steerage, is not half so important as whether his Americanism of to-day is real and genuine. No matter by what various crafts we came here, we are all now in the same boat.” Katherine J. Wu is a Boston-based science journalist and Story Collider senior producer whose work has appeared in National Geographic, Undark magazine, Popular Science and more. She holds a Ph.D. 
in Microbiology and Immunobiology from Harvard University, and was Smithsonian magazine's 2018 AAAS Mass Media Fellow.
431c8990a18d7ed969cecf96f11f3a70
https://www.smithsonianmag.com/history/history-college-dorms-180971457/
The Evolution of the College Dorm Chronicles How Colleges Became Less White and Male
The Evolution of the College Dorm Chronicles How Colleges Became Less White and Male When art historian Carla Yanni was assistant vice president for undergraduate education at Rutgers University’s New Brunswick, New Jersey, campus, she would often hear fellow administrators mocking their midcentury predecessors for building the “River Dorms”—three modernist student residence halls overlooking the Raritan River. “As if the people who built them must have been complete idiots,” she jokes. “So I used to think, ‘Now, you are well-meaning college administrators in the present, and weren’t the people in 1955 also well-meaning college administrators, and wouldn’t we like to know how those buildings got to be there?’” Yanni’s curiosity led her to investigate the architectural history of the college dormitory, which in some ways mirrors the history of higher education itself. Her new book, Living on Campus: An Architectural History of the American Dormitory (Univ. of Minnesota Press), details the history of undergraduate college dormitories, from the first purposefully built lodgings in colonial America to dorm takeovers during the student protests of 1968. As Yanni writes, “Residence halls are not mute containers for the temporary storage of youthful bodies and emergent minds”; they reveal and “constitute historical evidence of the educational ideals of the people who built them.” At a time when college marketing departments try to attract students by highlighting luxurious dorms as much as small class sizes or winning sports teams, it can be instructive to look back on this staple of the American undergraduate experience. The first US colleges were sponsored by Protestant denominations and tended to be isolated, in rural locations or small towns, to distance students from the corrupting influence of the city. Isolation, Yanni says, allowed an institution to “imprint its specific morality upon its followers.” Dormitories were necessary when local rooming houses lacked enough berths for students, but they also fit with the missionary spirit of these early institutions. Administrators emphasized the need for a moral education as well as an academic one, so the undergraduate experience took on a semi-monastic aura. Harvard University was at first a single, multipurpose building, with classrooms right next to sleeping rooms, on the outskirts of the newly founded town of Boston. As Yanni discovered, Harvard’s first governing board thought this provided “an advantage to Learning” because “the multitude of persons cohabiting for scholasticall communion” away from the rest of the world would serve to create America’s first crop of Puritan ministers. But that’s not to say that students agreed with the earliest stated purposes of dormitories. Benjamin Franklin, for example, was less interested in the moral or pedagogical benefits of his collegiate years than in socializing with other members of his class in order to find friends, business partners, and future brothers-in-law. In the absence of dormitories and sufficient rooms in private houses, students often took it upon themselves to create their own communal spaces: the first “purpose-built” fraternity houses. The first, the Zeta Psi house at the University of California, Berkeley, was merely a structure funded by alumni in the 1870s. 
As fraternities grew along with colleges, Yanni writes, each alumni group wanted “its younger brothers to occupy a house that was an ‘architectural ornament’—a sign of the fraternity’s wealth and a demonstration of the brothers’ contribution to the college.” This gave rise to the fraternity mansion, a design exemplified by the late 19th-century Psi Upsilon House at the University of Michigan. This new emphasis on wealth meant that fraternities were often expensive and exclusive. But exclusivity was, in many ways, already built into the American collegiate experience. Dorms initially were almost always segregated. “College life introduced men to other men like themselves,” Yanni writes. But when men who weren’t members of the white Protestant elite became students, most institutions shunted them into separate housing. The Harvard Indian College, for example, was built in 1655 so that white students wouldn’t have to live with Native students. This trend continued when white women of the middle and upper classes began to earn undergraduate degrees, in the mid-19th century, at both private women’s colleges and large land-grant universities. These students were expected to become homemakers, wives, and mothers, so their dormitories reflected the ideology of domesticity: they were not boarding houses but “cottages” to fit them for the roles they were expected to fulfill. This reflected a common educational philosophy of the time: as Charles F. Thwing, president of Western Reserve University in Cleveland, said in 1895, “all that learning and culture can offer” to women is “for the betterment of the home,” and to create fit helpmeets for male undergraduates. Yanni says that she was surprised to discover that this idea had influenced the plans for buildings constructed as late as the 1910s. When she was doing archival research about the University of Michigan’s Martha Cook Building, she discovered that “the donor wanted the women’s dormitory to civilize the young men.” He thought that young women’s university experience should be like a “charm school,” says Yanni, and the architecture of the dorms “perfectly aligns” with that goal. The multitude of lavishly decorated reception rooms and the large dining hall taking up the first floor of the dormitory suggest that once women came down from their rooms, they were to focus on socializing, rather than studying. Yanni ran into difficulties when researching what the expansion of American higher education to people of color meant for college residential living. “It’s very easy to find out who the first African American graduate of a university was,” she says, “but it’s many days in the archives if you [want to] find out if that person was allowed to live on campus.” Some of the earliest archival traces on the subject come from white students, alumni, faculty, and administrators objecting to having students of color living in residential facilities. Yanni points to the work of education historian Cally L. Waite on Oberlin College, which was founded in 1833 and admitted black students beginning two years later. By looking at community and student newspapers from the 19th century, Waite showed that African American and white students were living together in Oberlin dorms: in the 1880s, a long controversy erupted when a white matron, at the behest of white students, pushed their black classmates to a separate table at Ladies Hall, a women’s dorm. 
Throughout Living on Campus, Yanni engages with the concept of “environmental determinism”—a Victorian, quasi-utopian idea that environment shapes personal character, and that purposefully built, orderly buildings are essential to molding, in this case, undergraduate students into ideal citizens. By the 1920s and ’30s, dormitories had become crucibles in which deans and other university administrators, acting in loco parentis, transfigured children into adults. Administrators, writes Yanni, came to see dormitories as “an integral part of the educational pathway.” All students would, ideally, live on campus to get the full benefit of the collegiate experience. But thanks to the GI Bill after World War II, a new influx of students challenged this emphasis on campus living; there simply wasn’t enough space to house all of them. This led to the growth of the type of modernist high-rises that Yanni’s colleagues so lamented. These cookie-cutter dorms were relatively quick and inexpensive to build. As Yanni writes, however, “modernist architecture was, by its very nature, rigid and repetitive” and quickly “became a metaphor for the misery that dorm dwellers felt about their lives as subjugated students.” These residence halls made students feel anonymous, more products than people, a feeling at odds with what Yanni terms “the calls for radical change being heard in the 1960s.” “Students rejected in loco parentis,” writes Yanni. “They did not need caretaking. They were adults who wanted to be treated as such.” This radicalism manifested itself in the students’ living arrangements: integrated dormitories and projects like Kresge College at the University of California, Santa Cruz, which tried to imitate the “urbanism of an Italian hill town.” The residential area, built around the site’s redwoods, included not just dormitories but cafés, launderettes, meeting spaces, and classrooms in what were termed “living-learning units.” Some of these units had no interior walls at all, as residents themselves were supposed to divide up the space based on communal agreement. Despite these radical building plans of the 1960s, Yanni observes, the dorms of today still mimic many of the same core features of dorms of the past. Today’s students come from increasingly diverse ethnic and socio-economic backgrounds, but like their early forebears, they often share rooms along long corridors, in buildings that house many of their fellow students. The college dorm still acts as a space to transition into adulthood. This offers an explanation for a recent trend in student affairs: the construction and promotion of what Yanni terms “ever more elaborate residence halls, some of which resemble five-star hotels,” in an “amenities arms race.” Americans have come to accept dormitories as an essential and integral part of the undergraduate experience, one that should help students achieve academic excellence and fulfill their demands for apartment-like and therefore independent adult living, while also providing opportunities for meaningful interaction. “Dormitories are a measure of the fact that Americans value higher education for networking as much as for higher education,” says Yanni. Like all buildings, she adds, college residences also “carry the weight of social values, because unlike writing a poem or even painting a painting, it requires an enormous amount of capital to build a building.” Or, to put it another way, dormitories “don’t just happen.” Remember that on your next stroll across campus. 
Elyse Martin is associate editor, web content and social media, at the American Historical Association. This article was originally published at the American Historical Association's Perspectives on History.
4613e966416373ea8506b42993d544ff
https://www.smithsonianmag.com/history/history-creepy-dolls-180955916/?utm_content=buffer5b67b&no-ist
The History of Creepy Dolls
The History of Creepy Dolls Pollock’s Toy Museum is one of London’s loveliest small museums, a creaking Dickensian warren of wooden floors, low ceilings, threadbare carpets, and steep, winding stairs, housed in two connected townhouses. Its small rooms house a large, haphazard collection of antique and vintage toys – tin cars and trains; board games from the 1920s; figures of animals and people in wood, plastic, lead; paint-chipped and faintly dangerous-looking rocking horses; stuffed teddy bears from the early 20th century; even – purportedly – a 4,000-year-old mouse fashioned from Nile clay. And dolls. Dolls with “sleepy eyes”, with staring, glass eyes. Dolls with porcelain faces, with “true-to-life” painted ragdoll faces, with mops of real hair atop their heads, with no hair at all. One-hundred-and-fifty-year-old Victorian dolls, rare dolls with wax faces. Dolls with cheery countenances, dolls with stern expressions. Sweet dolls and vaguely sinister dolls. Skinny Dutch wooden dolls from the end of the 19th century, dolls in “traditional” Japanese or Chinese dress. One glassed-off nook of a room is crammed with porcelain-faced dolls in 19th-century clothing, sitting in vintage model carriages and propped up in wrought iron bedsteads, as if in a miniaturized, overcrowded Victorian orphanage. Some visitors to the museum, however, can’t manage the doll room, which is the last room before the museum’s exit; instead, they trek all the way back to the museum’s entrance, rather than go through. “It just freaks them out,” says Ken Hoyt, who has worked at the museum for more than seven years. He says it’s usually adults, not children, who can’t handle the dolls. And it happens more often during the winter, when the sun goes down early and the rooms are a bit darker. “It’s like you’d think they’ve gone through a haunted house… It’s not a great way to end their visit to the Pollock’s Toy Museum,” he says, laughing, “because anything else that they would have seen that would have been charming and wonderful is totally gone now.” A fear of dolls does have a proper name, pediophobia, classified under the broader fear of humanoid figures (automatonophobia) and related to pupaphobia, a fear of puppets. But most of the people made uncomfortable by the doll room at Pollock’s Toy Museum probably don’t suffer from pediophobia so much as an easy-to-laugh-off, often culturally reinforced, unease. “I think people just dismiss them, ‘Oh, I’m scared of dolls’, almost humorously – ‘I can’t look at those, I hate them,’ laughingly, jokingly. Most people come down laughing and saying, ‘I hated that last room, that was terrible,’” Hoyt says. Dolls – and it must be said, not all dolls – don’t really frighten people so much as they “creep” them out. And that is a different emotional state altogether. Dolls have been a part of human play for thousands of years – in 2004, a 4,000-year-old stone doll was unearthed in an archeological dig on the Mediterranean island of Pantelleria; the British Museum has several examples of ancient Egyptian rag dolls, made of papyrus-stuffed linen. Over millennia, toy dolls crossed continents and social strata, were made from sticks and rags, porcelain and vinyl, and have been found in the hands of children everywhere. 
And by virtue of the fact that dolls are people in miniature, unanimated by their own emotions, it’s easy for a society to project whatever it wants onto them: Just as much as they could be made out of anything, they could be made into anything. “I think there is quite a tradition of using dolls to reflect cultural values and how we see children or who we wish them to be,” says Patricia Hogan, curator at The Strong National Museum of Play in Rochester, New York, and associate editor of the American Journal of Play. For example, she says, by the end of the 19th century, many parents no longer saw their children as unfinished adults, but rather regarded childhood as a time of innocence that ought to be protected. In turn, dolls’ faces took on a more cherubic, angelic look. Dolls also have an instructional function, often reinforcing gender norms and social behavior: Through the 18th and 19th centuries, dressing up dolls gave little girls the opportunity to learn to sew or knit; Hogan says girls also used to act out social interactions with their dolls, not only the classic tea parties, but also more complicated social rituals such as funerals. In the early 20th century, right around the time that women were increasingly leaving the home and entering the workplace, infant dolls became more popular, inducting young girls into a cult of maternal domesticity. In the second half of the 20th century, Barbie and her myriad career (and sartorial) options provided girls with alternative aspirations, while action figures offered boys a socially acceptable way to play with dolls. The recent glut of boy-crazy, bizarrely proportioned, hyper-consumerist girl dolls (think Bratz, Monster High) says something about both how society sees girls and how girls see themselves, although what it says is a discussion for another day. So dolls, without meaning to, mean a lot. But one of the more recent ways we relate to dolls is as strange objects of – and this is a totally scientific term – creepiness. Research into why we think things are creepy and what potential use that might have is somewhat limited, but it does exist (“creepy”, in the modern sense of the word, has been around since the middle of the 19th century; its first appearance in The New York Times was in an 1877 reference to a story about a ghost). In 2013, Frank McAndrew, a psychologist at Knox College in Illinois, and Sara Koehnke, a graduate student, put out a small paper on their working hypothesis about what “creepiness” means; the paper was based on the results of a survey of more than 1,300 people investigating what “creeped” them out (collecting dolls was named as one of the creepiest hobbies). Creepiness, McAndrew says, comes down to uncertainty. “You’re getting mixed messages. If something is clearly frightening, you scream, you run away. If something is disgusting, you know how to act,” he explains. “But if something is creepy… it might be dangerous but you’re not sure it is… there’s an ambivalence.” If someone is acting outside of accepted social norms – standing too close, or staring, say – we become suspicious of their intentions. But in the absence of real evidence of a threat, we wait and, in the meantime, call them creepy. The upshot, McAndrew says, is that being in a state of “creeped out” makes you “hyper-vigilant”. “It really focuses your attention and helps you process any relevant information to help you decide whether there is something to be afraid of or not. 
I really think creepiness is where we respond in situations where we don’t have enough information to respond, but we have enough to put us on our guard.” Human survival over countless generations depended on the avoidance of threats; at the same time, humans thrived in groups. The creeped out response, McAndrew theorized, is shaped by the twin forces of being attuned to potential threats, and therefore out-of-the-ordinary behavior, and of being wary of rocking the social boat. “From an evolutionary perspective, people who responded with this creeped out response did better in the long run. People who didn’t might have ignored dangerous things, or they’re more likely to jump to the wrong conclusion too quickly and be socially ostracized,” he explains. Dolls inhabit this area of uncertainty largely because they look human but we know they are not. Our brains are designed to read faces for important information about intentions, emotions and potential threats; indeed, we’re so primed to see faces and respond to them that we see them everywhere, in streaked windows and smears of Marmite, toast and banana peels, a phenomenon known by the catchall term “pareidolia” (try not to see the faces in this I See Faces Instagram feed). However much we know that a doll is (likely) not a threat, seeing a face that looks human but isn’t unsettles our most basic human instincts. “We shouldn’t be afraid of a little piece of plastic, but it’s sending out social signals,” says McAndrew, noting too that depending on the doll, these signals could just as easily trigger a positive response, such as protectiveness. “They look like people but aren’t people, so we don’t know how to respond to it, just like we don’t know how to respond when we don’t know whether there is a danger or not... the world in which we evolved how we process information, there weren’t things like dolls.” Some researchers also believe that a level of mimicry of nonverbal cues, such as hand movements or body language, is fundamental to smooth human interaction. The key is that it has to be the right level of mimicry – too much or too little and we get creeped out. In a study published in Psychological Science in 2012, researchers from the University of Groningen in the Netherlands found that inappropriate nonverbal mimicry produced a physical response in the creeped out subject: They felt chills. Dolls don’t have the ability to mimic (although they do seem to have the ability to make eye contact), but because at least some part of our brain is suspicious about whether this is a human or not, we may expect them to, further confusing things. You can’t talk about creepy dolls without invoking the “uncanny valley”, the unsettling place where creepy dolls, like their robot cousins, and before them, the automatons, reside. The uncanny valley refers to the idea that humans react favorably to humanoid figures until a point at which these figures become too human. At that point, the small differences between the human and the inhuman – maybe an awkward gait, an inability to use appropriate eye contact or speech patterns – become amplified to the point of discomfort, unease, disgust, and terror. The idea originated with Japanese roboticist Masahiro Mori’s 1970 essay anticipating the challenges robot-makers would face. 
Although the title of the paper, “Bukimi No Tani”, is actually more closely translated as “valley of eeriness”, the word “uncanny” hearkens back to a concept that psychiatrist Ernst Jentsch explored in 1906 and that Sigmund Freud described in a 1919 paper, “The Uncanny”. Though the two differed in their interpretations – Freud’s was, unsurprisingly, Freudian: the uncanny recalls our repressed fears and anti-social desires – the basic idea was that the familiar is somehow rendered strange, and that discomfort is rooted in uncertainty. But the uncanny valley is, for scientists and psychologists alike, a woolly area. Given the resources being poured into robotics, there’s been more research into whether or not the uncanny valley is real, if it’s even a valley and not a cliff, and where exactly it resides. Thus far, results aren’t conclusive; some studies suggest that the uncanny valley doesn’t exist, some reinforce the notion that people are unsettled by inhuman objects that look and act too human. These studies are likely complicated by the fact that widespread exposure to more “natural” looking humanoid figures is on the rise through animated films and video games. Maybe like the Supreme Court standard for obscenity, we know uncanny, creepy humanoids when we see them? But before the 18th and 19th centuries, dolls weren’t real enough to be threatening. Only when they began to look too human did dolls start to become creepy and uncanny, and psychology began investigating. “Doll manufacturers figured out how to better manipulate materials to make dolls look more lifelike or to develop mechanisms that make them appear to behave in ways that humans behave,” says Hogan, pointing to the “sleep eye” innovation in the early 1900s, where the doll would close her eyes when laid horizontal in exactly the way real children don’t (that would be too easy for parents). “I think that’s where the unease comes with dolls, they look like humans and in some ways move like humans and the more convincing they look or move or look like humans, the more uneasy we become.” At Pollock’s, the dolls that people find particularly creepy are the ones that look more lifelike, says Hoyt; these are also the ones that have begun to decay in eerily inhuman ways. “The dolls don’t age well… I think any time that a doll really tried to look like a human being and now is 100 years old, the hair is decaying, the eyes don’t work any more. So it looks as much like a baby as possible, but like an ancient baby,” Hoyt says. Which presents an interesting phenomenon: The creepiness of realistic dolls is complicated by the fact that some people want dolls (and robots) that look as lifelike as possible. Reborns are a good illustration of the problem; hyper-realistic, these are custom-crafted infant dolls that, reborn artists and makers say, “you can love forever”. The more lifelike an infant doll is – and some of them even boast heartbeats, breathing motion, and cooing – the more desirable it is among reborn devotees, but equally, the more it seems to repulse the general public. Perhaps it comes down to what we can make dolls into. In A.F. 
Robertson’s 2004 investigation into doll-collecting, Life Like Dolls: The Collector Doll Phenomenon and the Lives of the Women Who Love Them, some of the women who collected porcelain dolls thought of their dolls as alive, as sentient beings with feelings and emotions; these women who referred to their doll collections as “nurseries” were sometimes “shunned” by other antique doll collectors who did not have the same relationship with their own dolls. Women – and it is almost exclusively women – who collect reborns often treat them as they would real babies; some psychologists have talked about “reborns” as “transition objects” for people dealing with loss or anxiety. Freud may have argued that all children wish their dolls could come to life, but even so, it’s not socially acceptable for adults to entertain the same desire. If we are creeped out by inanimate things that aren’t human but look too human, we may also be creeped out by adult humans pretending that these inanimate things are real. “We’re creeped out by people who have these kinds of hobbies and occupations because right away, we jump to the conclusion, ‘What kind of person would willingly surround themselves with… humanlike things that are not human?’” says McAndrew, who also noted that his and Koehnke’s survey on creepiness found that most people think that creepy people don’t realize they’re creepy. “We’re on our guard to those types of people because they’re out of the ordinary.” It’s also exactly the kind of thing that’s easy to exploit in media. Some doll makers blame Hollywood films for the creepy doll stigma, and there’s no doubt that moviemakers have used dolls to great effect. But the doll was creepy well before Hollywood came calling. In the 18th and 19th centuries, as dolls became more realistic and as their brethren, the automata, performed more dexterous feats, artists and writers began exploring the horror of that almost immediately. The tales of German writer E.T.A. Hoffmann are widely seen as the beginning of the creepy automaton/doll genre; Jentsch and Freud used Hoffmann’s “The Sandman” as a case study in the uncanny. The story, published in 1816, involves a traumatized young man who discovers that the object of his affection is in fact a clever wind-up doll, the work of a sinister alchemist who may or may not have murdered the young man’s father; it drives him mad. The horror in this story turned on the deceptive attractiveness of the girl, rather than any innate murderousness in her; in the 19th century, creepy doll stories tended to be more about the malevolence of the maker than about the doll itself. In the 20th century, creepy dolls became more actively homicidal, as motion picture technology transformed the safely inanimate into the dangerously animate. Some evil dolls still had an evil human behind them: Dracula director Tod Browning’s 1936 The Devil-Doll featured Lionel Barrymore as a man wrongly convicted of murder who turns two living humans into doll-sized assassins to wreak his revenge on the men who framed him. But then there was The Twilight Zone’s murderous Talky Tina, inspired by one of the most popular and influential dolls of the 20th century, Chatty Cathy – “My name is Talky Tina and you’d better be nice to me!”; the evil clown doll from Poltergeist, cannily marrying two creepy memes for maximum terror; and of course, Chucky, the My Buddy clone possessed by the soul of a serial killer in the Child’s Play series. 
The 1980s and 1990s saw dozens of B-movie variations on the homicidal doll theme: Dolly Dearest, Demonic Toys, Blood Dolls. In 2005, the evil denizens of the Doll Graveyard came back for teenaged souls (and eyeballs, it appears); in 2007, homicidal ventriloquist dummies were going around ripping people’s tongues out in Dead Silence. Most recently, devil worshippers inadvertently turned a smiling vintage doll into a grinning demon in last October’s Annabelle, a film in the Conjuring franchise. Director John Leonetti, who did not return requests for comment, told The Huffington Post that dolls made exceptional vehicles for horror films. “If you think about them, most dolls are emulating a human figure,” said Leonetti. “But they’re missing one big thing, which is emotion. So they’re shells. It’s a natural psychological and justifiable vehicle for demons to take it over. If you look at a doll in its eyes, it just stares. That’s creepy. They’re hollow inside. That space needs to be filled.” With evil. But the story of Annabelle the demonic doll becomes far creepier – and more titillating – when it’s accompanied by the claim that it’s “based on a true story”. Paranormal investigators Ed and Lorraine Warren claimed that Annabelle the Raggedy Ann doll, whose original owners frequently found her in places they hadn’t left her, was being used by a demonic spirit in its quest to possess a human soul; she now lives in a specially-made demon-proof case marked “Warning: Positively Do Not Open” at the Warrens’ Occult Museum in Connecticut. Annabelle is not the only evil doll the museum alleges it houses, and there are many more such purportedly real-life possessed dolls around the world; as NPR reported in March, “Haunted dolls are a thing”. Robert the Doll, the lifelong companion of an eccentric Key West artist, glowers at people from the East Martello Museum, where he’s become a tiny, haunted cottage industry unto himself; you can even buy your own replica Robert doll to blame things on. If you are unable to visit a haunted or possessed doll in the flesh (or porcelain, as the case may be), then you can always watch a live feed of this rural Pennsylvania family’s haunted doll collection. These stories, like the stories of real-life clowns who murdered, feed into a narrative that makes dolls scary. It doesn’t appear that the creepy stigma increasingly attached to dolls, or the bevy of scary doll films, has done anything to really harm sales of dolls in the US. While sales of dolls in 2014 were lower than they had been 10 years earlier, the figures were still in the billions of dollars – $2.32 billion to be exact, outstripping sales of vehicular toys, action figures, arts and crafts, and plush toys, and second only to outdoor and sports toy sales. It hasn’t put a damper on the secondhand and collectible doll market, where handmade porcelain dolls regularly fetch in the thousands of dollars. In September 2014, a rare Kämmer & Reinhardt doll from the early 1900s was auctioned off for an unbelievable £242,500 ($395,750); the report suggested the buyer not see Annabelle, which was due to be released soon after. The creepiness of dolls sometimes adds to their appeal; some doll makers are actively courting creepy, such as this reborn artist who sells “monster” babies alongside regular babies, or the popular and scary Living Dead Dolls line. 
Because the fact is, people like creepy. The same mechanism that makes us hyper-vigilant also keeps us interested: “We’re fascinated and enthralled and a little on edge because we don’t know what comes next, but we’re not in any way paralyzed by it,” muses Hogan. “We’re more drawn into it, which I think it’s that drawing in or almost being under the spell of wanting to find out what comes next is what good storytellers exploit.” And, maybe, good doll makers, too? Linda Rodriguez McRobbie is an American freelance writer living in London, England. She covers the weird stuff for Smithsonian.com, Boing Boing, Slate, mental_floss, and others, and she's the author of Princesses Behaving Badly.
78714f7ba3532814551d93f48ae4469b
https://www.smithsonianmag.com/history/history-five-uniquely-american-sandwiches-180967078/
The History of Five Uniquely American Sandwiches
The History of Five Uniquely American Sandwiches Everyone has a favorite sandwich, often prepared to an exacting degree of specification: Turkey or ham? Grilled or toasted? Mayo or mustard? White or whole wheat? We reached out to five food historians and asked them to tell the story of a sandwich of their choosing. The responses included staples like peanut butter and jelly, as well as regional fare like New England’s chow mein sandwich. Together, they show how the sandwiches we eat (or used to eat) do more than fill us up during our lunch breaks. In their stories are themes of immigration and globalization, of class and gender, and of resourcefulness and creativity. A taste of home for working women (Megan Elias, Boston University) The tuna salad sandwich originated from an impulse to conserve, only to become a symbol of excess. In the 19th century – before the era of supermarkets and cheap groceries – most Americans avoided wasting food. Scraps of chicken, ham or fish from supper would be mixed with mayonnaise and served on lettuce for lunch. Leftovers of celery, pickles and olives – served as supper “relishes” – would also be folded into the mix. The versions of these salads that incorporated fish tended to use salmon, white fish or trout. Most Americans didn’t cook (or even know of) tuna. Around the end of the 19th century, middle-class women began to spend more time in public, patronizing department stores, lectures and museums. Since social conventions kept these women out of the saloons where men ate, lunch restaurants opened up to cater to this new clientele. They offered women exactly the kind of foods they had served each other at home: salads. While salads made at home often were composed of leftovers, those at lunch restaurants were made from scratch. Fish and shellfish salads were typical fare. When further social and economic changes brought women into the public as office and department store workers, they found fish salads waiting for them at the affordable lunch counters patronized by busy urban workers. Unlike the ladies’ lunch, the office lunch hour had time limits. So lunch counters came up with the idea of offering the salads between two pieces of bread, which sped up table turnover and encouraged patrons to get lunch to go. When canned tuna was introduced in the early 20th century, lunch counters and home cooks could skip the step of cooking a fish and go straight to the salad. But there was a downside: The immense popularity of canned tuna led to the growth of a global industry that has severely depleted stocks and led to the unintended slaughter of millions of dolphins. A clever way to use dinner scraps has become a global crisis of conscience and capitalism. I like mine on toasted rye. East meets West in Fall River, Massachusetts (Imogene Lim, Vancouver Island University) “Gonna get a big dish of beef chow mein,” Warren Zevon sings in his 1978 hit “Werewolves of London,” a nod to the popular Chinese stir-fried noodle dish. During that same decade, Alika and the Happy Samoans, the house band for a Chinese restaurant in Fall River, Massachusetts, also paid tribute to chow mein with a song titled “Chow Mein Sandwich.” Chow mein in a sandwich? Is that a real thing? I was first introduced to the chow mein sandwich while completing my doctorate at Brown University. Even as the child of a Chinatown restaurateur from Vancouver, I viewed the sandwich as something of a mystery. It led to a post-doctoral fellowship and a paper about Chinese entrepreneurship in New England. 
The chow mein sandwich is the quintessential “East meets West” food, and it’s largely associated with New England’s Chinese restaurants – specifically, those of Fall River, a city crowded with textile mills near the Rhode Island border. The sandwich became popular in the 1920s because it was filling and cheap: Workers munched on them in factory canteens, while their kids ate them for lunch in the parish schools, especially on meatless Fridays. It would go on to be available at some “five and dime” lunch counters, like Kresge’s and Woolworth – and even at Nathan’s in Coney Island. It’s exactly what it sounds like: a sandwich filled with chow mein (deep-fried, flat noodles, topped with a ladle of brown gravy, onions, celery and bean sprouts). If you want to make your own authentic sandwich at home, I recommend using Hoo Mee Chow Mein Mix, which is still made in Fall River. It can be served in a bun (à la sloppy joe) or between sliced white bread, much like a hot turkey sandwich with gravy. The classic meal includes the sandwich, french fries and orange soda. For those who grew up in the Fall River area, the chow mein sandwich is a reminder of home. Just ask famous chef (and Fall River native) Emeril Lagasse, who came up with his own “Fall River chow mein” recipe. And at one time, Fall River expats living in Los Angeles would hold a “Fall River Day.” On the menu? Chow mein sandwiches, of course. A snack for the elites (Paul Freedman, Yale University) Unlike many American food trends of the 1890s, such as the Waldorf salad and chafing dishes, the club sandwich has endured, immune to obsolescence. The sandwich originated in the country’s stuffy gentlemen’s clubs, which are known – to this day – for a conservatism that includes loyalty to outdated cuisine. (The Wilmington Club in Delaware continues to serve terrapin, while the Philadelphia Club’s specialties include veal and ham pie.) So the club sandwich’s spread to the rest of the population, along with its lasting popularity, is a testament to its inventiveness and appeal. A two-layer affair, the club sandwich calls for three pieces of toasted bread spread with mayonnaise and filled with chicken or turkey, bacon, lettuce and tomato. Usually the sandwich is cut into two triangles and held together with a toothpick stuck in each half. Some believe it should be eaten with a fork and knife, and its blend of elegance and blandness makes the club sandwich a permanent feature of country and city club cuisine. As far back as 1889, there are references to a Union Club sandwich of turkey or ham on toast. The Saratoga Club-House offered a club sandwich on its menu beginning in 1894. Interestingly, until the 1920s, sandwiches were identified with ladies’ lunch places that served “dainty” food. The first club sandwich recipe comes from an 1899 book of “salads, sandwiches and chafing-dish dainties,” and its most famous proponent was Wallis Simpson, the American woman whom Edward VIII abdicated the throne of Great Britain to marry. Nonetheless, an 1889 article from the New York Sun entitled “An Appetizing Sandwich: A Dainty Treat That Has Made a New York Chef Popular” describes the Union Club sandwich as appropriate for a post-theater supper, or something light to be eaten before a nightcap. This was one type of sandwich that men could indulge in, the article seemed to be saying – as long as it wasn’t eaten for lunch. 
‘The combination is delicious and original’ (Ken Albala, University of the Pacific) While the peanut butter and jelly sandwich eventually became a staple of elementary school cafeterias, it actually has upper-crust origins. In the late-19th century, at elegant ladies’ luncheons, a popular snack was small, crustless tea sandwiches with butter and cucumber, cold cuts or cheese. Around this time, health food advocates like John Harvey Kellogg started promoting peanut products as a replacement for animal-based foods (butter included). So for a vegetarian option at these luncheons, peanut butter simply replaced regular butter. One of the earliest known recipes that suggested including jelly with peanut butter appeared in a 1901 issue of the Boston Cooking School Magazine. “For variety,” author Julia Davis Chandler wrote, “some day try making little sandwiches, or bread fingers, of three very thin layers of bread and two of filling, one of peanut paste, whatever brand you prefer, and currant or crabapple jelly for the other. The combination is delicious, and so far as I know original.” The sandwich moved from garden parties to lunchboxes in the 1920s, when peanut butter started to be mass produced with hydrogenated vegetable oil and sugar. Marketers of the Skippy brand targeted children as a potential new audience, and thus the association with school lunches was forged. The classic version of the sandwich is made with soft, sliced white bread, creamy or chunky peanut butter and jelly. Outside of the United States, the peanut butter and jelly sandwich is rare  – much of the world views the combination as repulsive. These days, many try to avoid white bread and hydrogenated fats. Nonetheless, the sandwich has a nostalgic appeal for many Americans, and recipes for high-end versions – with freshly ground peanuts, artisanal bread or unusual jams – now circulate on the web. The Daughters of the Confederacy get creative (Andrew P. Haley, University of Southern Mississippi) The Scotch woodcock is probably not Scottish. It’s arguably not even a sandwich. A favorite of Oxford students and members of Parliament until the mid-20th century, the dish is generally prepared by layering anchovy paste and eggs on toast. Like its cheesier cousin, the Welsh rabbit (better known as rarebit), its name is fanciful. Perhaps there was something about the name, if not the ingredients, that sparked the imagination of Miss Frances Lusk of Jackson, Mississippi. Inspired to add a little British sophistication to her entertaining, she crafted her own version of the Scotch woodcock for a 1911 United Daughters of the Confederacy fundraising cookbook. Miss Lusk’s woodcock sandwich mixed strained tomatoes and melted cheese, added raw eggs, and slathered the paste between layers of bread (or biscuits). As food historian Bee Wilson argues in her history of the sandwich, American sandwiches distinguished themselves from their British counterparts by the scale of their ambition. Imitating the rising skylines of American cities, many were towering affairs that celebrated abundance. But those sandwiches were the sandwiches of urban lunchrooms and, later, diners. In the homes of southern clubwomen, the sandwich was a way to marry British sophistication to American creativity. For example, the United Daughters of the Confederacy cookbook included “sweetbread sandwiches,” made by heating canned offal (animal trimmings) and slathering the mashed mixture between two pieces of toast. 
There’s also a “green pepper sandwich,” crafted from “very thin” slices of bread and “very thin” slices of green pepper. Such creative combinations weren’t limited to the elites of Mississippi’s capital city. In the plantation homes of the Mississippi Delta, members of the Coahoma Woman’s Club served sandwiches of English walnuts, black walnuts and stuffed olives ground into a colorful paste. They also assembled “Friendship Sandwiches” from grated cucumbers, onions, celery and green peppers mixed with cottage cheese and mayonnaise. Meanwhile, the industrial elite of Laurel, Mississippi, served mashed bacon and eggs sandwiches and creamed sardine sandwiches. Not all of these amalgamations were capped by a slice of bread, so purists might balk at calling them sandwiches. But these ladies did – and they proudly tied up their original creations with ribbons. Paul Freedman, Chester D. Tripp Professor of History, Yale University Andrew P. Haley, Associate Professor of American Cultural History, The University of Southern Mississippi Imogene L. Lim, Professor of Anthropology, Vancouver Island University Ken Albala, Professor of History, Director of Food Studies, University of the Pacific Megan Elias, Associate Professor of the Practice of Gastronomy, Boston University
ea86a1bd62fac8dc731b7fb2a02a2211
https://www.smithsonianmag.com/history/history-humble-suitcase-180951376/
The History of the Humble Suitcase
The History of the Humble Suitcase When Phileas Fogg decides to circle the globe in Around the World in 80 Days, the 1873 novel by Jules Verne, he doesn't take a suitcase. “We'll have no trunks,” he says to his servant Passepartout, “only a carpet bag, with two shirts and three pairs of stockings for me, and the same for you. We'll buy our clothes on the way.” At the time, the suitcase as we know it today hardly existed. In Verne's day, proper travel required a hefty trunk built of wood, leather, and often a heavy iron base. The best trunks were waterproofed with canvas or tree sap, as steamships were a reigning mode of travel. Without this protection, a suitcase in the hold of a heaving, leaky ship would probably have been wet within a few hours, and crushed by sliding trunks within a few more. When the suitcase finally did catch on at the end of the 19th century, it was quite literally a case for suits. A typical suitcase came equipped with an inner sleeve for storing shirts, and sometimes a little hat box on the side. But even in the early 20th century, the “dress-suit case” was only one of countless styles of container that travelers could buy, from steamer trunks to club bags to Eveready portable wardrobes. These were boom times for the baggage business. Which, of course, probably seems like an utterly useless fact. Most people care about containers much less than they care about the things containers contain—the pairs of pants, the paperback books, the miniature bottles of shampoo. But the history of the suitcase spans every major transportation revolution since the steamship. And this means that suitcases carry a lot more than spare socks and underwear—they carry in their design a subtle history of human movement. It's a good thing Phileas Fogg didn't take a trunk, because dragging one from steamship to railroad to carriage to hot air balloon would have ruined his rapid pace. Trunk-laden travel was becoming increasingly illogical as long-distance transportation grew more common and diverse. Up to that point, tourism had been a decidedly upper-class phenomenon, and the rich could rely on an army of hired hands to carry luggage. In the 18th century, young European elites on the Grand Tour had often traveled with several servants in a coach filled with trunks and furniture. There wasn't sufficient incentive to revise an inconvenient design while rich travelers simply relied on railway porters and hotel bellhops. (Indeed, when Fogg meets an Indian princess along the way, he buys luggage for her, and the pair is soon carried to their steamship by palanquin—basically a chair with handles that's lifted with human labor—with “their luggage brought up after on a wheelbarrow.”) But the late 19th century marked a pivot point in the history of transportation: it was the beginning of mass tourism, of travel for travel's sake (as opposed to, say, pilgrimages to Jerusalem or migration to industrial mill towns). Humans had long traveled for the sake of curiosity and exploration, of course, but by 1900 or so, hotels in Switzerland were recording millions of overnight stays per year, and a summer day could draw hundreds of thousands of visitors to British beaches. Travel wasn't just for the wealthy anymore. Suitcases began as an afterthought in the luggage and leather goods business, but they soon became the very symbol of travel. An 1897 wholesale price list included the words “suit case” only twice in a 20-page list of luggage types. In a 1907 T. Eaton & Co. 
catalog, trunks took up a full page while suitcases shared a page with club bags and valises. In a 1911 United Company catalog, however, around 40 percent of the advertisements were for suitcases. (It's worth pointing out that these catalogs were from North America, where migration required people—and not just the wealthy—to carry their own belongings far and often). Early suitcases (usually called “suit cases” or “suit-cases”) were lighter and more portable than trunks, but they were still bulky by today's standards. Leather, wicker or thick rubbery cloth was stretched over a rigid wood or steel frame. Corners were rounded out using brass or leather caps. Such suitcases tended to have roughly the proportions of a hardback book: flattened and easy to carry, with a handle on the long side. Until steamship travel declined during the mid-20th century, many were advertised as waterproof. Lightweight models were often marketed specifically to women. As trunks went out of style, suitcases took on not just practical but also cultural significance. By the 1920s, suitcases featured in books such as The Hardy Boys and such films as The Woman in the Suitcase, as a literary symbol for both mobility and mystery—perhaps filled with gold, photographs, or simply a stranger's possessions. During the Great Depression, farmers who worked fields away from home were called “suitcase farmers.” Suitcases still had a ways to go before achieving their present form, though. With the rapid expansion in automobile travel during the 1920s, and a more gradual expansion of air travel a couple of decades later, suitcases found new applications but also new kinds of competition. A 1933 business report written to President Franklin Roosevelt by Hugh S. Johnson, an administrator in the National Recovery Administration, put it this way: “With the increase in the use of automobiles, it has become easy to utilize simple cardboard containers secured at little or no cost, in the back of the automobile in lieu of luggage.” Suitcases, in other words, had to get lighter and cheaper if they wanted to compete. The robust wood, steel, and heavy leather suitcase gave way to cardboard and plastic models that emphasized “modern” materials and convenience. Think back now to the suitcases you can buy today. Many feature large pieces of rounded hard plastic (a practice which seems to have started in the 1960s), or are built with synthetic fabrics stretched over minimalist alloy frames. Zippers have largely replaced clasps, and few suitcases are specifically waterproof. Perhaps most importantly, suitcases come in two distinct sizes—“carry-on” or “check-in”—both of which tend to come with wheels. Essentially all these developments came in the last half-century or so, particularly with the onset of mass aviation. Unlike transportation by automobile, which takes a traveler from door to door, a long flight can require half a mile of walking during check-in, layovers, and arrival. And while a ship's hold or luggage car could store large amounts of luggage regardless of shape, an airplane's stowage areas have specific proportions and size limits. The suitcase had to adapt, as a 1970 patent by Bernard Sadow explained: Whereas formerly luggage would be handled by porters and be loaded or unloaded at points convenient to the street, the large terminals of today, particularly air terminals, have increased the difficulty of baggage handling. Thus, it is often necessary for a passenger to handle his own baggage in an air, rail, or bus terminal. 
Further, where the passenger does handle his own luggage, he is often required to walk very great distances. Sadow's patent, as you might have guessed, was the crucial innovation of the wheeled suitcase. 1970 may seem remarkably recent for such a useful development. (A wheeled trunk was patented in 1887, and a wheeled suitcase in 1945—those initial models simply didn't catch on). We have to remember that aviation had only recently become truly widespread, though: in the two decades before the patent, flights had increased their passenger totals by ten times, from 17 million in 1949 to 172 million in 1969. That was also the year that set records for the most hijackings in a year, with an astonishing 82—a fact which contributed to increasingly strict baggage checks that funneled passengers through longer lines on the way to centralized security checkpoints. Luggage design remains tightly linked to aviation. Carry-on luggage (which, by the way, was transformed in 1987 with the wheeled “Rollaboard” bag and its now-ubiquitous collapsible handle) conforms to the dimensions of the airlines with the smallest storage area. When new weight restrictions for checked bags kicked in during the 2000s, meanwhile, practically every luggage manufacturer released new lightweight models to stay competitive. These suitcases tend to be vertical instead of horizontal, because of their wheels, and relatively stout and thick, because of airline restrictions on suitcase dimensions. There's an irony to the shape of these modern suitcases. They've come a long way from the flat and stackable “dress-suit case,” shaped like a big hardback book. Today's luggage instead fits the rough proportions of a big shoe box—and this gives it almost the same shape as those unwieldy trunks that Phileas Fogg preferred to leave at home. A century of revolution in transportation, in other words, seems to have brought us back to the hefty trunk shape that the first suitcases replaced. Just as we might pack and re-pack our belongings to fit our luggage, we make and re-make our luggage to fit our built world. Daniel A. Gross is a freelance journalist and public radio producer based in Boston.
2ce5d98d99d6f0873559a1ca6b431f0a
https://www.smithsonianmag.com/history/history-mar-a-lago-180965214/
The Ironic History of Mar-a-Lago
The Ironic History of Mar-a-Lago Within 48 hours after the presidential election last November, the Palm Beach Daily News headlined a question that “many in town” were asking: “Trump’s Mar-a-Lago: Another Winter White House?” By January, the president-elect had an answer: “Writing my inaugural address at the Winter White House, Mar-a-Lago,” he tweeted from his elite private club, along with a photograph of himself seated behind a large desk, legal pad and pen in hand. Palm Beach might have been having déjà vu, and not only because President-elect John F. Kennedy wrote his inaugural address at his father’s estate in the town’s North End. The woman who built Mar-a-Lago in the 1920s and presided over it for almost half a century, Marjorie Merriweather Post, had gone to great lengths to turn the mansion into an official wintertime presidential retreat. But even extreme wealth has its limitations, as my visit to the Post Family Papers suggests. They occupy 57 seldom-seen linear feet at the University of Michigan’s Bentley Historical Library and document the life of one of the most famous and consequential women of the 20th century. The files offer unusual glimpses of the girl who glued labels onto packages of Postum, the coffee substitute that made her family’s fortune, and of the woman who built the General Foods Corporation. Her four husbands, her bountiful philanthropy, her megayacht, her grand balls, her jaw-dropping jewels—all are documented in the archives. And then there’s a volume bound in still-handsome red leather. A yellowing file card dated “February/March 1976” is taped to the cover: “Original Proposal for Disposition of Mar-a-Lago.” The mansion dates to the 1920s, when Palm Beach’s wealthiest visitors were forsaking luxury hotels for their own digs, says Debi Murray, chief curator of the Historical Society of Palm Beach County. Post herself explored the site of her future home, on 17 acres of scrub between Lake Worth and the Atlantic. (Mar-a-Lago means “Sea to Lake” in Spanish.) Construction began in 1923 and kept some 600 workers busy, even though, as Murray notes, “Florida entered the Depression earlier than the rest of the country.” The mistress ensured that her workers wouldn’t go hungry. Even by Palm Beach standards, Mar-a-Lago was grandiose: 58 bedrooms, 33 bathrooms with gold-plated fixtures (easier to clean, Post believed), an 1,800-square-foot living room with 42-foot ceilings. Its 110,000 square feet glinted with gold leaf, Spanish tiles, Italian marble and Venetian silks. All told, Post spent $7 million—somewhere north of $90 million today. It was finished in 1927. That March, Post and her second husband, Edward F. Hutton, had a few score guests over for dinner before the annual Everglades Costume Ball. The hosts wore costumes evoking the reign of Louis XVI. But there was also noblesse oblige: In 1929, when she hired the Ringling Bros. and Barnum & Bailey Circus to perform for a charity fund-raiser, she invited underprivileged children to attend. In 1944, she offered her grounds to World War II veterans who needed occupational therapy. In 1957, she opened Mar-a-Lago to the International Red Cross Ball, and the gala event has been held there many times since—but not this year. 
It was one of more than 20 charity events that were relocated from Mar-a-Lago or canceled after the president’s remarks on violent protests in Charlottesville, Virginia, in August. As the social seasons came and went, however, Palm Beach tastemakers’ tastes changed. The grand houses they built in the 1920s were seen as “white elephants,” Murray says, and were razed in the ’50s and ’60s. Except that isn’t how Post saw Mar-a-Lago—or Hillwood, her estate in Washington, D.C., or Camp Top­ridge, her retreat in the Adirondacks. She arranged to donate all three properties to government entities. The state of New York added some of Top­ridge’s acreage to a forest preserve but sold most of its 68 buildings to a private owner. The Smithsonian Institution, citing maintenance costs, returned Hillwood to the Post Foundation, which now runs it as a museum. And the original Mar-a-Lago proposal, the one bound in red leather, was to donate it to the state of Florida for a center for advanced scholars, but state officials also balked at the maintenance costs. By 1968, according to other papers in the archive, Post had turned to Plan B: Mar-a-Lago as winter White House, property of the United States. After she died, in 1973, at age 86, the Post Foundation pursued the idea. But in 1981, the federal government declined, for the same reason the Floridians and the Smithsonian did. Thus Mar-a-Lago went on the market. Three potential sales collapsed before Donald Trump bought it in 1985, paying a reported $8 million for the estate and its furnishings—a small fraction of the original cost, no matter how you calculate it. And after three decades and the most confounding presidential election in living memory, Marjorie Merriweather Post’s wish for her mansion came true. Mar-a-Lago is the most lavish winter White House, but for a century chief executives have tried to escape Washington when the snow flies. --By Anna Diamond President Wilson spent some of the winter of 1912-13 in Pass Christian, Mississippi, at the Beaulieu mansion, a.k.a. the “Dixie White House,” then owned by Marie Louise Ayer of New Jersey. In 1880, a previous owner, Gen. William Harney, a Mexican-American War veteran, hosted a former president—Ulysses S. Grant. This article is a selection from the November issue of Smithsonian magazine Michael Luongo is a journalist whose work has appeared in the New York Times and the Guardian.
7ce6f61c0489bf8eb8d5c10efdd41c5d
https://www.smithsonianmag.com/history/history-merit-badges-cultural-history-united-states-180970306/?page=12
How the History of Merit Badges Is Also a Cultural History of the United States
How the History of Merit Badges Is Also a Cultural History of the United States At first glance, there’s something undeniably old fashioned about the Girl Scouts and the Boy Scouts. The organizations have legacies that stretch back more than a hundred years to the days when boys were taught to tramp through the woods and girls were taught to keep a tidy home. Today some 4 million kids still wear those iconic cloth sashes dotted with merit badges—a tradition first introduced by the Boy Scouts in 1911 and the Girl Scouts in 1912. But if you look more closely at each embroidered round, you’ll discover that the scouts have been anything but static over the last century. The ever-changing roster of Girl Scout and Boy Scout merit badges forms an accidental history of American childhood, a record of what it has meant for girls and boys to “be prepared”—the eternal scouting motto—through two world wars, the Cold War and the War on Terror, through the birth of television, the dawn of the Space Age and the arrival of the internet. Often these boys and girls were our advance scouts: Boys earned a merit badge in automobiling in 1911, when barely one percent of the population owned a car. Girls earned one in Civics in preparation for the vote; it was renamed the Citizen badge with the ratification of the 19th amendment in 1920. Now, as the Boy Scouts enroll the first girls in their ranks, and the Girl Scouts introduce two dozen new STEM badges, outpacing the boys in science education, a look back at what we’ve taught our kids, from the Greatest Generation to the next one. One of 24 new STEM badges offered by the Girl Scouts—on topics from cybersecurity to robotics. This article is a selection from the October issue of Smithsonian magazine April White is a former senior editor for Smithsonian magazine.
85dbb7c404b38554e830a35849e5bf59
https://www.smithsonianmag.com/history/history-white-picket-fence-180971635/
How Did the White Picket Fence Become a Symbol of the Suburbs?
How Did the White Picket Fence Become a Symbol of the Suburbs? In little Taylor, Mississippi, outside Oxford, a developer named Campbell McCool is building Plein Air, a 64-acre community that, in time, will include 200 wood-frame residences. Each house is advertised as traditionally Southern, most featuring wide front porches you can imagine sipping lemonade on. They have all the modern amenities a home buyer could desire, but if a customer wants a fence—and about a third do—it must be of white wooden pickets 40 inches high. Scratch-built and painted, that fence costs about $2,500, which buys not only a practical enclosure but a complicated piece of the American Dream. Plein Air is a familiar vision of suburbia, one we’ve seen in countless movies, advertisements and television shows for more than half a century. But while the pickets remained a constant, our attitudes toward them changed. In It’s a Wonderful Life, Frank Capra stages that postwar paean’s most optimistic scene, in which George Bailey woos Mary Hatch, in front of a picket fence. Forty years later, David Lynch opens his unsettling 1986 Blue Velvet with a pan down sinister pickets and overripe blossoms. And partway through the 2013 premiere of “The Americans” the camera cuts to the front yard of spies Elizabeth and Philip Jennings, set off by white pickets. “The white picket fence is a kind of shorthand for Americana,” says John Mott, production designer for the show’s first two seasons. “The point of ‘The Americans’ is what it’s like to live a fraudulent life. These people are not Americans—they’re Russian agents—but they have to blend into the American setting.” Before they crossed the Atlantic, pickets meant something completely different. In Old Europe, pickets—from piquet, French for “pointed stick or board”—were military gear, logs sharpened to shield archers from cavalry. Needing to demarcate and perhaps defend their land, New World colonists installed fences of rough pickets, bare or painted white. In the 19th century, mass production made fence parts cheaper and fancier, and the picket fence became fashionable from New England to Key West. But not everyone loved fences. In 1841 landscape design pioneer Andrew Jackson Downing denounced them as “an abomination among the fresh fields, of which no person of taste could be found guilty.” Downing lost that round; as the nation spread west, so did fencing. In the late 1800s, developers of newfangled “suburbs” briefly made the borderless front yard trendy, scholar Fred E.H. Schroeder writes in Front Yard America. But fenceless yards were no match for the Colonial Revival design movement that appeared around the time of the 1876 centennial and championed the picket fence. The modest totem of middle-class prosperity stood even through the 1930s, when many American households couldn’t afford to whitewash a fence, never mind an entire house. Blame the Cold War for doing in the picket fence. Whether seeking security, embracing new technology or dodging a tedious paint job, many ’50s-era suburbanites chain-linked their lots. But the symbolism of the white picket fence was inescapable, and it slid into popular culture as a visual shorthand for the good life. A kind, gentle America posed behind the pickets in television fantasies like “Father Knows Best” and “Leave It to Beaver”—an imaginary all-white realm in which the worst thing that could happen was Eddie Haskell teasing the Beaver. 
The actual fences surged in popularity again in the 1980s, revived by New Urbanist developers attempting to recreate the appearance of walkable early suburbs. The look’s persistence amuses suburbia scholar Jeff Hardwick, who sees the modern picket fence as an echo of an echo. “Everything winds up looking like a suburb that hasn’t existed in 70 or 80 years,” he says. Today picket fences are sometimes mandated by homeowner associations, a regimentation that renders a benign historic artifact alienating—the opposite of its nature. “You can see through it; if you need to, you can hop over it,” says developer McCool of the fence. “If you’re standing in your yard and someone on the sidewalk pauses, you can have a conversation.” As for the oft-invoked “good old days,” remember: Whether you’re talking about the 1980s or the 1890s, those times were no less complex than these times, when the American middle class that made the fence a hallmark occupies shaky ground. The white picket fence is so simple—a few slats affixed to horizontal rails, a gate or two—as to invite endless interpretation. But maybe we should retire the pickets as metaphor and let them do what they do best: keep kids and dogs where they belong and encourage neighborly interaction. Enough deconstruction already. Let a fence be a fence. Research by Anna Diamond and Matthew Browne This article is a selection from the April issue of Smithsonian magazine
5dfef51fcfb022f58f237d097ac5b65c
https://www.smithsonianmag.com/history/history-wives-replacing-their-dead-husbands-in-congress-180974092/
The History of Wives Replacing Their Dead Husbands in Congress
The History of Wives Replacing Their Dead Husbands in Congress Tomorrow, Marylanders in the state’s 7th congressional district will vote in a primary election to decide who will be the nominees to replace Congressman Elijah Cummings, whose death in October 2019 left open the seat he had held since 1996. Among the many names (more than 20) on the Democrats’ primary ballot is Cummings’ wife, Maya Rockeymoore Cummings, a public policy consultant and the former chair of the Maryland Democratic Party. If she wins, she’ll become part of a nearly century-long tradition of “widow’s succession,” when wives either ran or were selected to fill their husband’s vacated seats in Congress in Washington. (According to the code for the House of Representatives, vacant seats are filled through a special election; only Senate seats can be filled by governor appointment, with some exceptions.) This custom has slowed in recent years: If Rockeymoore Cummings wins the primary and then the general, she would be the first woman since 2005 to succeed a husband who died in office. But the tradition had a defining impact on the makeup of Congress in the 20th century and on female political representation. As the Los Angeles Times reported in 1998, “Among first-time House candidates between 1916-93, 84 percent of the widows won, while only 14 percent of other women were victorious. The trend was strongest when women were rarer in politics; 35 of the 95 women who served in Congress before 1976 were congressional wives first.” The trend was once so pronounced that Diane Kincaid, a political scientist who studied the topic in the 1970s, wrote, “statistically, at least, for women aspiring to serve in Congress, the best husband has been a dead husband.” Writing 25 years later, academics Lisa Solowiej and Thomas L. Brunell concurred that it “is arguably the single most important historical method for women to enter Congress.” When Congressman John Nolan of California died in mid-November 1922, after he had been re-elected to a fifth term, local leaders came to his widow, Mae Ellen Nolan, with an idea. As researcher Hope Chamberlin writes in A Minority of Members: Women in the U.S. Congress, “an unlikely coalition of influential San Francisco Republicans representing both business and labor first approached her.” Why recruit a “quiet, pleasant, businesslike” woman for the role? Chamberlin cites one political insider’s candid opinion: “The Nolan name means victory.” Nolan said at the time, “I owe it to the memory of my husband to carry on his work.” In a special election held to finish John’s term and serve the next one, she defeated six opponents and headed to Washington, where she was the first woman to head a Congressional committee (the Committee on Expenditures in the Post Office). Party leaders who recruited widows merely saw them as temporary placeholders; they “capitalized on public sympathy to ensure that the party held the seat in the interim...and helped the party avoid internal disputes and provide time to recruit a ‘real’ replacement,” write academics Barbara Palmer and Dennis Simon in Political Research Quarterly. In an interview, Debbie Walsh, director of the Center for American Women and Politics (CAWP), says that the parties assumed that husband and wife shared the same values, so they could count on the wives to uphold their husband’s politics in office. Some of the widows were content with the placeholder role that the party presumed, serving just one year or one term. 
After her term was over, Mae Ellen Nolan declined to run for reelection, wanting nothing more to do with Washington. “Politics is entirely too masculine to have any attraction for feminine responsibilities,” she said at the time. But many women embraced the opportunity to pursue politics themselves and surprised the men who recruited them. Kincaid identified one example in Senator Hattie Caraway of Arkansas, who filled her husband’s seat in the Senate in 1931. Kincaid wrote that Caraway “confounded the Governor who appointed her and who openly coveted the seat himself by entering the primary for renomination.” She won that election, and others, before losing a bid for reelection in 1944. According to research from CAWP, of the 39 women who entered the House of Representatives as successors to their husbands, 21 stayed on for more than two years, often sustaining illustrious careers. Among them are Representatives Edith Nourse Rogers, who sponsored the original GI bill; Florence Prag Kahn, the first Jewish woman to serve in Congress and the first to serve on the House Military Affairs Committee; Corinne “Lindy” Boggs, who championed women’s rights; and Cardiss Collins, who advocated for Medicare expansion and affirmative action. Although widows had name recognition among constituents, they still faced competitive races. “They usually had to overcome opposition for their office; nearly half have sought to retain their seats,” wrote Kincaid. “Significant power was accumulated and employed by those who extend their tenure.” Moreover, she pointed out that some widows, like Rep. Leonor Sullivan of Missouri, “have vigorously sought and/or campaigned for their husbands’ seats, and have been denied and/or defeated.” Denied the party’s support in the 1951 special election, Sullivan beat six men in the primary and won the general election the next year. As she competes for Maryland Democrats' support, Rockeymoore Cummings carries the enviable endorsement of EMILY’s List, but opponent Kweisi Mfume holds the endorsement of the Maryland State AFL-CIO. Rep. Beverly Byron, also of Maryland, was candid about her practical reasons for running for her husband’s seat, which she occupied from 1979 to 1993. “In 24 hours, I became a widow, a single parent, unemployed and a candidate for Congress,” she told the Los Angeles Times in 1998. “I knew I needed to work; it was the only job offered to me.” Many widows who went to Congress were already familiar with its workings, having been party to their husbands’ world. “They had worked on their husbands’ campaigns and as a result, knew their district well,” explain Palmer and Simon. Many wives were deeply entwined with their husbands’ policy setting and political strategy. Before the powerful congressman Hale Boggs died, his wife, Lindy, “was his chief political adviser,” explains the House of Representatives archives. “She set up her husband’s district office in New Orleans, orchestrated his re–election campaigns, canvassed voters, arranged for her husband’s many social gatherings, and often acted as his political surrogate as demands on his time became greater the further he climbed in the House leadership.” Some widows’ tenures in D.C. came to overshadow their husbands’ legacies. Perhaps most notable was Senator Margaret Chase Smith, a famous and formidable politician who spoke out against Senator Joseph McCarthy’s redbaiting. 
She originally went to Congress in 1940 to fill her husband Clyde’s seat and, after her election to the Senate in 1948, she made history as the first woman to serve in both chambers. She lost her last election in 1972, when she was in her mid-70s. Today, just one widow successor sits in Congress: Rep. Doris Matsui from California. (Matsui is a member of the Smithsonian Board of Regents.) Rep. Debbie Dingell became the first woman to succeed her retiring husband in his congressional seat (John stepped down in 2015 and passed away in 2019). To date, no widower has succeeded his wife. Widow’s succession “used to be the norm and it is now quite clearly the exception,” says Walsh. “In those early days, these women's lives and careers were probably incredibly closely intertwined with their husbands. They didn't really have their own careers separate from their husbands’ political career.” “For a lot of women” these days, she continues, “they have their own lives, their own careers. And they may not be available…to just step in and take his job.” But for Rockeymoore Cummings, her career aligns with her husband’s and her political ambition predates his death. She was the chair of the Maryland Democratic Party and was a onetime candidate for governor, before dropping out when Cummings was hospitalized. As she faces down her many fellow Democrats in a crowded primary, she echoes widows before her, like Mae Nolan. As she said to CNN, “I’m now running to build on his legacy in Congress.” But it’s just as likely, should she win in the primary, that she’ll make the seat her own. Anna Diamond is the former assistant editor for Smithsonian magazine.
b2cbde097d63774e7ec46b83cc468649
https://www.smithsonianmag.com/history/hit-by-a-bus-how-ben-hogan-hit-back-24870580/
Hit by a Bus, How Ben Hogan Hit Back
Hit by a Bus, How Ben Hogan Hit Back On the damp and chilly morning of Wednesday, February 2, 1949, Ben Hogan got up before the sun and hit the El Capitan Motel coffee shop in Van Horn, Texas. He and his wife, Valerie, had driven more than 500 miles east from Phoenix the day before, and while the road made his wife queasy, he craved a quick breakfast, and they still had to go 500 miles east to Fort Worth. Ben ate, went back to their room and packed the Cadillac with their luggage and his golf clubs. Ben Hogan had reached the pinnacle of his career. For the first time, the diminutive golfer had captured two major tournaments in the same year—the U.S. Open and the PGA Championship. Two weeks earlier, his face had appeared on the cover of Time magazine, above the quotation that would define him: “If you can’t outplay them, outwork them.” Hogan had been working for as long as he could remember. In 1922, when he was 9, his father, a blacksmith named Chester, pointed a gun at his own chest and committed suicide. Hogan biographer James Dodson says some reports place Ben in the room of their home in Fort Worth, Texas, at the time. The loss of the family breadwinner meant the Hogan children had to contribute financially. Ben sold newspapers at the train station, then became a caddy at a nearby country club. He was 11. When he wasn’t carrying bags, he spent countless hours on the practice range. Digging hundreds of balls out of the dirt, day after day, he worked to the point where, legend had it, his hands would bleed. He sought to hit a perfectly controlled ball, and to achieve a repeatable swing that would hold up under pressure. Perhaps it allowed him to feel a measure of control over the chaos around him. Whatever the reason, he could be found on the range long after his fellow caddies, and ultimately his fellow competitors, had left the golf course. In 1949, even the best professional golfers drove thousands of miles each year to tournaments across the country, lugging not just their clothes and clubs, but their families. By February 1949, Hogan had driven more than 3,000 miles since the start of the golf season, and he’d won two of his first four tournaments. He was leading the tour on the money list in what promised to be another remarkable year–but he told Time, “It’s the traveling. I want to die an old man, not a young one.” Ben and Valerie Hogan pulled out of the parking lot at the El Capitan in sunshine, heading east along two-lane Highway 80. They hadn’t gone ten miles when they ran into a dense fog and a slick, icy film on the road. Hogan cut his speed to 25 miles per hour; then he saw “four lights winking at me.” A Greyhound bus was trying to pass a truck, filling Hogan’s lane. He looked to veer off the road but saw a culvert on his right. “I knew we were going to get hit,” he said. The Greyhound plowed head-on into Hogan’s Cadillac. At the last second, the golfer hurled himself across his wife. “That was the first break I got in all this trouble,” Hogan later said. The steering wheel and part of his car’s engine were “hammered thru the cushion on my side of the seat.” If he had stayed where he was, he was convinced, he’d have been crushed. Hogan blacked out upon impact; Valerie was dazed but remained conscious. Both of them were pinned against the dashboard. She managed to lower the passenger-side window and began screaming for help as Ben slipped in and out of consciousness. He moaned and told her to “Get out!” He was afraid the car was going to catch fire. 
Valerie freed herself and raised Ben to a sitting position. Another driver came along, and together they pulled the golfer from the Cadillac. It took ninety minutes for an ambulance to arrive. As Hogan was lifted in, he asked his wife if his golf clubs were accounted for. They were. Word had quickly spread that Ben Hogan had been killed. Some of his fellow golfers, playing in a pro-am tournament in Arizona, walked off the course mid-round upon hearing the false news. Later that day, Hogan’s friends were informed that he was alive but in critical condition, and some of them made it to the Hotel Dieu Hospital in El Paso. Valerie seemed to be fine, despite the bruises on her face and various cuts, but they saw Ben strapped to the bed, covered in gauze. His face was cut and bruised, and his left eye was practically swollen shut. Doctors had diagnosed Hogan with a fractured left collarbone, a double fracture of his pelvis, a broken ankle and a chipped rib. After setting his bones, doctors expected him to go home in a few weeks. A “complete recovery” was possible, they said, within two months—mostly due to “Ben’s fighting heart.” But before Hogan could leave, his lungs gave doctors cause for concern; he had severe chest pains. Blood clots had formed in his legs after two weeks in bed, and by the end of February, doctors discovered that one clot had traveled to his lung. They gave him several blood transfusions, then performed abdominal surgery to tie off the inferior vena cava—the large vein that carries blood from the lower half of the body to the heart. Hogan would spend another pain-filled month in the hospital, unable to leave his bed. A wiry 137 pounds at the time of the accident, he dropped nearly 20 pounds during his stay. A return to the golf course was no longer seen as certain. It was March 29, 1949, before Hogan made it home to Fort Worth. He passed the summer trying to regain his strength. He was too weak to swing a club, and even short walks wore him out. The procedure on his vena cava caused chronic pain, swelling and fatigue—conditions that would plague him for the rest of his life. But he was determined to work as hard on his recovery as he had on his golf swing. “It’s going to be a long haul,” he told reporters, “and in my mind, I don’t think that I’ll ever get back the playing edge I had last year. You work for perfection all your life, and then something like this happens. My nervous system has been shot by this, and I don’t see how I can readjust it to competitive golf. But you can bet I’ll be back there swinging.” “Don’t believe a word of it,” Valerie said. “Ben will be himself again, bones, nerves and all.” Sam Snead, Cary Middlecoff and a young golfer named Arnold Palmer battled for headlines in the summer of 1949, while Hogan shuffled around his house. He was named non-playing captain of the U.S. Ryder Cup team and traveled to England for the matches, where he delighted fans by putting on the practice green. It was the most he could do, seven months after the accident. Reporters described him as “crippled.” But after returning to the States, Hogan began to regain some strength. Then he began to practice. By June of 1950, 16 months after the accident, Bantam Ben was back on the course, this time trying to reclaim his place as golf’s greatest competitor in American golf’s biggest tournament—the U.S. Open at Merion Golf Club in Pennsylvania. 
He had played several tournaments leading up to the Open, but on the third and final day of grueling competition, he began to wilt under 36 holes of golf in the heat, and his lead began to evaporate on the final few holes. With everything on the line, Hogan needed to hit an impossibly long shot from the fairway to make par on the 18th and final hole. A packed gallery formed a silent gauntlet around him as he practically staggered to his ball, according to eyewitnesses. Judging the yardage, Hogan reached for his one iron—the most difficult club in his bag to hit. The old joke goes that if you’re ever in a lightning storm, the safest thing to do is to hold up your one iron, for even God can’t hit a one iron. Hogan steadied himself over the ball, slowly began his backswing, unleashed his power and sent the ball flying. The crowd around him gasped at the sound of his shot and the sight of the ball heading toward the flag. Hogan went on to par the hole and force a three-way playoff. After getting a good night’s sleep, he easily won the U.S. Open the following day, the only player of the three to shoot a round under par. The tournament represented Hogan’s rebirth: He would go on to dominate golf like never before, winning in 1953 the unprecedented “Hogan Slam” of three straight major tournaments. (He did not play in the fourth major—the PGA Championship—because he did not want to walk more than 18 holes a day.) The car crash, and Hogan’s near death, many of his friends later said, made him a more outgoing and compassionate man. But despite everything he accomplished on the course after his accident, Hogan was convinced he had come closest to perfection in the months before the crash. His post-crash golf swing, recorded on film, is still used as an example of near-perfect ball striking and mechanics. Only Hogan himself disagreed. “I was better in 1948 and ’49 than I’ve ever been,” he said, years later. Sources Articles: “Golfer Ben Hogan Injured in Car Crash,” Chicago Daily Tribune, February 3, 1949. “Hogan, Wife Tell of Texas Auto Crash,” Chicago Daily Tribune, March 30, 1949. “Hogan Faces Stern Fight in Hospital,” Hartford Courant, March 4, 1949. “Golfer Hogan Winning His Hardest Match of All,” Chicago Daily Tribune, March 29, 1949. “Remarkable Hogan Wins ’50 U.S. Open,” by Larry Schwartz, ESPN Classic, November 19, 2003. “Hogan’s Return: Back From Tragedy to Win the 1950 U.S. Open,” by Damon Hack, Golf.com, October 20, 2008. “Hogan Majored in Courage,” by Larry Schwartz, ESPN’s Sports Century. “What Could Have Been,” by Jaime Diaz, Golf Digest, June 2009. “Ben Hogan’s Wife Remembers Husband as Exhibit Opens in USGA Museum,” Associated Press, June 9, 1999. Books: James Dodson, Ben Hogan: An American Life, Doubleday, 2004. Curt Sampson, Hogan, Rutledge Press, 1996. Gilbert King is a contributing writer in history for Smithsonian.com. His book Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America won the Pulitzer Prize in 2013.
4f6cf8807f64f1e6062b3e34dba3632e
https://www.smithsonianmag.com/history/holocaust-and-hungary-prime-minister-180964139/
Why It Matters That Hungary’s Prime Minister Denounced His Country’s Role in the Holocaust
Why It Matters That Hungary’s Prime Minister Denounced His Country’s Role in the Holocaust On an early page of Night, Elie Wiesel’s autobiographical account of the Holocaust, he recalls the Hungarian police’s orders as they echoed throughout his small Jewish ghetto. “Faster! Faster! Move, you lazy good-for-nothings!” they screamed. “That was when I began to hate them, and my hatred remains our only link today,” he writes. “They were our first oppressors. They were the first faces of hell and death.” Wiesel’s family was not unique. Before the war’s end, the country’s leaders and its people would be responsible for the deaths of hundreds of thousands of Jews, Roma people and other “undesirables.” Some lived within Hungary’s official post-World War I borders, while others, including Wiesel and his family, lived in annexed territory that was part of the former Austro-Hungarian Empire. Hungary’s culpability in the Holocaust is undeniable. Yet in the years since the Cold War, the nation has faced heavy criticism from Holocaust scholars who say the country is shifting from acknowledging that complicity to portraying itself as a helpless victim of Nazi occupation. Recently, though, when Israeli prime minister Benjamin Netanyahu visited Budapest (the first Israeli prime minister to do so since 1989), Hungarian prime minister Viktor Orbán made headlines during a joint press conference when he denounced his country’s relationship with Nazi Germany during World War II. “[A]t the time we decided that instead of protecting the Jewish community, we chose collaboration with the Nazis,” Orbán said, according to the Associated Press. “I made it clear to [Netanyahu] that this can never happen again. In the future, the Hungarian government will protect all its citizens.” Orbán’s statement came days after Hungary’s government received major blowback for launching an anti-migrant campaign with posters depicting the face of Hungarian-born Jewish billionaire George Soros and praising Hungary’s controversial World War II leader, Miklós Horthy. This admission of guilt and call for reconciliation was a noticeable step for the government, which has been criticized for celebrating nativist politicians and writers with anti-Semitic backgrounds. It also contrasted with how the Orbán government has characterized Hungary’s role in the Holocaust in the past. During Hungary’s commemoration of the 70th anniversary of the events of 1944, when the Nazi army entered Hungary, the government erected a monument in Budapest’s Liberty Square. Titled “Memorial to the victims of the German occupation,” it depicts an eagle with sharp talons, signifying Nazi Germany, swooping down and attacking the archangel Gabriel, who symbolizes the Hungarian people. The statue was emblematic of the fight in Hungary over its history. Critics called the interpretation a whitewashing of the role that Hungary’s government and civilians had in the crimes of the Holocaust. They believed it presented all Hungarian suffering as equal and demanded the removal of the statue. The government denied the accusations and refused to remove the monument. The statue still stands in the square, illustrating the deep divide that remains in the country, which is still struggling to reconcile with its history. **** Long before that fateful spring of 1944, Hungarian leader Miklós Horthy had fostered anti-Semitic fervor in his country. 
When he first took power in 1920, the country’s Numerus Clausus law, which put a quota on the number of Jewish students allowed to attend universities, went into effect, along with the White Terror, a military crackdown targeting Jews and other counterrevolutionaries. In the build-up to World War II, a series of anti-Jewish laws starting in 1938 was also responsible for othering Hungarian Jews. But the alliance Hungary struck with the Axis Powers in 1940 at first kept the majority of Hungary’s Jews safe from Nazi Germany. More than 20,000 Jews that Hungarian authorities designated as “foreign nationals” were sent in 1941 to German-occupied Ukraine, with full knowledge of the fate that would await them upon their arrival. The next year, the Hungarian military and citizen forces took part in the Novi Sad massacre in northern Serbia, where more than 1,000 people, mostly Jews, were killed. And approximately 40,000 Jewish men conscripted into forced labor battalions died of exposure, enemy fire or mass executions during Hungary’s retreat from Stalingrad in early 1943. Still, unlike much of Europe, most of Hungary’s Jews remained alive in the spring of 1944. Because Hungary was an official ally of the Axis powers, Hitler had left it to find its own solution to the “Jewish Question” up until this point. Now, the Fuhrer demanded Hungary’s Jews. That spring, with the Soviet army advancing on Hungary's border, and Hungary’s own army largely destroyed at Stalingrad, Nazi troops first entered Hungary’s borders. They came without resistance. Horthy invited the Fuhrer’s troops into the country, and then verbally agreed to send what was initially 100,000 Jews to the Germans for “work” in a bid to remain in power. Compounding that number, Horthy decided to send the workers’ families as well, ultimately sealing the fates of some 437,000 Jews. “[Horthy’s] involvement is absolutely clear because it's his government that does it, and his oral instruction that does it,” Paul Shapiro, director of the United States Holocaust Memorial Museum’s Center for Advanced Holocaust Studies, tells Smithsonian.com. “Everyone knew in the spring of 1944 what transporting Jews into German hands meant.” Horthy and Hungary were in an impossible situation, but as Robert Rozett, director of the Yad Vashem Libraries, writes in Tablet, with only some 150 Nazi Germans in charge of the deportations, it was left to officials of the Hungarian Interior Ministry, the Gendarmes and local authorities to carry out their orders. Rather than refuse to be complicit, Hungarians chose to cooperate. “The Germans pushed for concerted action against Hungarian Jewry, and Horthy not only did not resist—he put the government apparatus at their disposal. The well-oiled process of destruction of the Jews followed quickly: restrictions, wearing the Jewish badge, confiscations, the establishment of ghettos and systematic deportations,” Rozett writes. It took until July, with the Allies’ continued victories showing how the war would end, for Horthy to order a stop to the deportations and open armistice negotiations with the Soviets, says Shapiro. Only then did Hitler back a government takeover, starting the fascist Arrow Cross Party’s reign of terror. During their rule, Arrow Cross members targeted the Budapest Jews, the only Jews who remained in Hungary near the end of the war. Horthy had spared them in his sweep, but as The Economist writes, the reason for this act wasn’t necessarily born of compassion. 
Rather, Horthy had been warned that he was in danger of being tried for war crimes if deportations continued. The Arrow Cross Party committed unspeakable crimes and killed or deported an estimated 100,000 Jews before Soviet troops took control of the country in 1945. Their deeds cast a black mark on Hungary’s history, but the puppet government wasn’t alone in spreading terror in the country. If the narrative of Hungary and the Holocaust is told accurately, Horthy and those who worked with the government have the blood of more than 400,000 on their hands. *** Wiesel, for his part, didn’t return to Hungary until 2009. Donning a blue yarmulke and black trench coat, the then-81-year-old lit a candle at the Holocaust Memorial and Documentation Center in Budapest. Photographers captured the moment that Wiesel kneeled down, his shadow reflected against the center’s granite walls. There, the names of Hungarian victims killed in the Holocaust were etched. Somewhere on the walls were the names of Wiesel’s younger sister, mother and father. Wiesel’s trip came at a turning point for Hungarian memory and the Holocaust. The state-of-the-art center had opened just five years before, in 2004. At the time, the museum symbolized a new era of openness in documenting Hungary’s role in the Holocaust. Following the fall of the Soviet Union and the start of free elections in Hungary in 1990, Hungary had taken strides toward accountability for its actions. During a 50th anniversary commemoration of the Hungarian Holocaust in 1994, political leaders officially apologized for the government’s complicity in the “Final Solution.” Hungary’s coalition government went on to establish a national Holocaust Commemoration Day. Hungary also joined the international task force on Holocaust research and commissioned the creation of the state-run Holocaust Memorial and Documentation Center. But while Hungary in the early 2000s showed signs of promise for its work memorializing its past, it also carried the seeds of its future. Across Hungary, skinheads clad in Nazi-like uniforms would begin to evolve into the Jobbik party, Hungary’s extreme far-right, nativist group. A fringe faction at the time, the party would soon prove capable of winning 20 percent of the vote in the 2014 parliamentary elections. In a keynote address delivered before the Hungarian National Assembly, Wiesel spoke about his fears for the country’s future. "Wherever in the world I come and the word Hungary is mentioned, the next word is anti-Semitism," he said. "I urge you to do even more to denounce anti-Semitic elements and racist expressions in your political environment and in certain publications." The call to action, though, was in vain. Hungary’s failing economy had created a welcoming environment for far-right, nativist sentiments. *** This month, a new party is rising to the right of the Jobbik ticket. Criticizing Jobbik for shifting toward a more centrist public platform, the group, which calls itself Force and Determination, says it represents "the white European man" and seeks to spread the idea of "ethnic self-defense." "We don't want to muse about the past — there is only forward. We must believe that even for us there is an empty page in the history book," a member of the new group told the Associated Press. The apathetic attitude toward history goes beyond this new far-right party. The state-run Holocaust memorial and museum, despite its promising start, has suffered devastating funding cuts. 
As Beáta Barda, curator of Hungary’s Trafo House of Contemporary Art and Association of Independent Performing Artists, wrote in an email to Smithsonian.com in the fall, “It is a dead institution, a kind of must for certain schools, no programmes, we are just a corner away, and [it’s] as if it never existed.” Instead, visitors are directed to the “House of Terror,” a state-sponsored propaganda museum built in 2002 that tells the state-sanctioned story of Hungary and the Holocaust. In one display, it does so literally—an exhibit rotates a figure dressed in a Nazi uniform on one side and a Soviet uniform on the other, conflating Nazism with Communism. Before his death, Wiesel, outraged that Hungarian government officials had attended a reburial of a writer who was a member of the Arrow Cross Party, penned a final public letter in protest of its actions, in which he explained why he felt compelled to return a state award once given to him with much celebration. He did not live to see the Hungarian government bestow a similar award of state import—the Order of Merit of the Knight’s Cross—on Zsolt Bayer, a racist, anti-Semitic journalist who has referred to Jewish people as “stinking excrement.” The government justified the honor last summer by claiming it was for the “exploration of several national issues” and “as a recognition of his exemplary journalistic work,” The Hungarian Spectrum reported at the time. In response, more than 100 past recipients (and counting) of Hungarian state awards returned their own honors in outrage, viewing the Bayer incident as yet another example of the government’s implicit encouragement of anti-Semitism. Orbán’s recent decision to speak out about Hungary’s culpability in the Holocaust, along with his vow to Netanyahu to fight anti-Semitism in the country today, is notable by comparison. But if Orbán wants to be taken at his word, there is much work to be done. Jacqueline Mansky is a freelance writer and editor living in Los Angeles. She was previously the assistant web editor, humanities, for Smithsonian magazine.
d0693a9cc68d095b8d78b4a0d6b8d868
https://www.smithsonianmag.com/history/how-a-new-yorker-article-launched-the-first-shot-in-the-war-against-poverty-17469990/
How a New Yorker Article Launched the First Shot in the War Against Poverty
How a New Yorker Article Launched the First Shot in the War Against Poverty On January 19, 1963, the New Yorker published a 13,000-word essay, “Our Invisible Poor,” the longest book review the magazine had ever run. No piece of prose did more to make plain the atrocity of poverty in an age of affluence. Ostensibly a review of Michael Harrington’s book The Other America, which had all but disappeared since its publication in 1962, “Our Invisible Poor” took in a slew of other titles, along with a series of dreary economic reports, to demonstrate these facts: The poor are sicker than everyone else, but they have less health insurance; they have less money, but they pay more taxes; and they live where people with money seldom go. What Dwight Macdonald explained was how a rising American middle class could have failed even to see poverty. “There is a monotony about the injustices suffered by the poor that perhaps accounts for the lack of interest the rest of society shows in them,” Macdonald wrote. “Everything seems to go wrong with them. They never win. It’s just boring.” “Our Invisible Poor” is not boring. It is frank. “The poor are even fatter than the rich.” It is courageous. “The federal government is the only purposeful force,” he insisted, “that can reduce the number of the poor and make their lives more bearable.” And it is smart. What Macdonald did, in a way that few people do anymore, was to digest a complex and specialized field of academic scholarship for a popular audience. He cared about facts and evidence. He just didn’t like the way academics wrote: without force, without passion and without, apparently, the ability to tell the difference between an important finding and a mind-bogglingly obvious one. “Although it is impossible to write seriously about poverty without a copious use of statistics,” Macdonald insisted, “it is possible to bring thought and feeling to bear on such raw material.” He knew how to sting. The Other America sold 70,000 copies the year after Macdonald’s essay was published (the book has since sold more than a million copies). “Our Invisible Poor” was one of the most widely read essays of its day. Walter Heller, chairman of the Council of Economic Advisers, gave John F. Kennedy a copy. The president charged Heller with launching a legislative assault on poverty. After Kennedy’s assassination, Lyndon B. Johnson took up that charge, waging a war on poverty. He lost that war. In the years since, with the rise of a conservative movement opposed to the basic tenets of Macdonald’s interpretation and Johnson’s agenda, the terms of the debate have changed. Government, Macdonald believed, was the solution. No, Ronald Reagan argued, citing the failures of Johnson’s War on Poverty, government is the problem. “The worst part of being old and poor in this country,” Macdonald wrote, “is the loneliness.” Something, he knew, had to be done. He wanted everyone who read “Our Invisible Poor” to see that, too. The problem is, we’ve never been able to agree about who ought to do it.
da9880021aefb7461754bb61ae74726f
https://www.smithsonianmag.com/history/how-america-grappled-immigration-100-years-ago-180962058/
Literacy Tests and Asian Exclusion Were the Hallmarks of the 1917 Immigration Act
Literacy Tests and Asian Exclusion Were the Hallmarks of the 1917 Immigration Act “There is an old immigrant saying translated into many languages that goes, ‘America beckons, but Americans repel,’” says Alan Kraut, a professor of history at American University in Washington, D.C. The political debate today over the flow of immigrants through U.S. borders merits a look back to 100 years ago, when Congress overrode a presidential veto to pass the Immigration Act of 1917, the most sweeping version of that type of legislation the country had ever created. The United States has always grappled with how to promote pluralism and protect its citizens at the same time—and the fight from a century ago was no different. In the years leading up to the act, millions of immigrants from Europe poured into the U.S., with 1.3 million passing through Ellis Island in 1907 alone. During that period, the immigrants filled gaps in the nascent industrial economy, making up the majority of workers in Pennsylvania coal fields, Chicago stockyards and New York garment factories. But Congress, acting upon decades of xenophobic and economic concerns and the emergent “science” of eugenics, saw the matter differently. It had attempted to pass laws curbing the flow from Europe numerous times; an English literacy test component actually passed in the House on five occasions and the Senate on four, but was twice vetoed by Presidents Cleveland and Taft. The test was a part of the 1917 act, as was the expansion of an “undesirable” list that included epileptics and political radicals. The act also levied an $8 tax on every adult immigrant (about $160 today) and barred all immigrants from the “Asiatic zone.” Congress voted in early 1917 to override President Wilson's veto of the act. Wilson himself was ambivalent on immigration, having earlier said, “We are going to keep our doors wide open so that those who seek this thing from the ends of the earth may come and enjoy it.” But he also agreed with some provisions of the act, and found fault mainly in one aspect of the bill: “I cannot rid myself of the conviction that the literacy test constitutes a radical change in the policy of the Nation which is not justified in principle.” Alabama congressman John L. Burnett, who was chairman of the House Committee on Immigration and Naturalization, reintroduced the literacy component of the bill multiple times. Burnett was also a member of the Dillingham Commission, a four-year investigation of immigration that ended in 1911 and concluded that immigrants from southern and eastern Europe posed a serious threat to American society. The 1917 act built on previous legislation, including the Chinese Exclusion Act of 1882 and the Gentlemen’s Agreement of 1907, which was an informal system for regulating immigration from Japan. Much of the justification for this targeted exclusion—particularly of Asians—was based on racism and the dubious pseudoscience of eugenics researchers like Madison Grant, who wrote The Passing of the Great Race in 1916. “To admit the unchangeable differentiation of race in its modern scientific meaning is to admit inevitably the existence of superiority in one race and of inferiority in another,” Grant wrote. “The Anglo-Saxon branch of the Nordic race is again showing itself to be that upon which the nation must chiefly depend for leadership, for courage, for loyalty, for unity and harmony of action.” It was such a widespread belief that the U.S. 
Surgeon General and senior members of the Public Health Service (whose duties included medical inspections of passengers disembarking at Ellis Island) were publicly aligned with eugenics in 1914. “Eugenics was something that very bright, intelligent people talked about in the same way that we talk [today] about genetic engineering,” says Kraut. Proponents of eugenics advocated “marriage patterns and sterilization so the best people, as they defined it, prospered and had many children, and that would make society better.” The literacy test, while not as direct a ban as the Asiatic barred zone, also had its roots in eugenics and the desire for a “superior stock.” The original version of the literacy test required reading and writing a short passage of the U.S. Constitution. But it was remarkably unsuccessful in weeding out newcomers. As actually implemented, the test required reading only short passages in any language, and if a man was literate and his wife and children weren’t, they all still earned access to the country. Supporters believed it would reduce the number of new arrivals (mainly from eastern and southern Europe) by more than 40 percent. In reality, only 1,450 of 800,000 immigrants between 1920 and 1921 were excluded on the basis of literacy. Due in part to the act’s failure to cull greater numbers from the flow of immigrants, a new system was put into place in 1921 and then revised in 1924. The new system relied on quotas for each country of origin. The countries could only provide immigration visas to 2 percent of the total number of people of each nationality in the U.S. as of the 1890 census, and the law continued to completely exclude East Asia. The quota system meant more than 50,000 Germans could come to the country annually, but fewer than 4,000 Italians were allowed, compared to the peak of over 2 million immigrants from Italy between 1910 and 1920. This ambivalence about immigration is almost as American as immigration itself, Kraut says. Americans recognize the contributions immigrants make, but there’s also a sense of economic and moral competitiveness. “We’re constantly changing, expanding and contracting,” Kraut says. “Right now Mr. Trump has us in a period where we seem to be looking inward and contracting.” But he sees the recent airport protests as a sign that the issue is as contentious as ever. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
ecff8594b3dbb84f7b84632e7116a794
https://www.smithsonianmag.com/history/how-dolley-madison-saved-the-day-7465218/
When Dolley Madison Took Command of the White House
When Dolley Madison Took Command of the White House In the years leading up to America’s second war with Britain, President James Madison had been unable to stop his penny-pinching secretary of the treasury, Albert Gallatin, from blocking Congressional resolutions to expand the country’s armed forces. The United States had begun the conflict on June 18, 1812, with no Army worth mentioning and a Navy consisting of a handful of frigates and a fleet of gunboats, most armed with a single cannon. In 1811, Congress had voted to abolish Alexander Hamilton’s Bank of the United States, making it nearly impossible for the government to raise money. Worst of all, the British and their European allies had engaged (and would ultimately defeat) Napoleon’s France in battles across Europe in 1812 and 1813, which meant the United States would have to fight the world’s most formidable army and navy alone. In March 1813, Gallatin told the president, “We have hardly money enough to last till the end of the month.” Along the Canadian border, American armies stumbled into ruinous defeats. A huge British naval squadron blockaded the American coast. In Congress, New Englanders sneered at “Mr. Madison’s War,” and the governor of Massachusetts refused to allow any of the state’s militiamen to join the campaign in Canada. Madison fell ill with malaria and the aged vice president, Elbridge Gerry, grew so feeble that Congress began arguing about who would become president if both men died. The only good news came from victories over lone British warships by the tiny American Navy. Dolley Madison’s White House was one of the few places in the nation where hope and determination continued to flourish. Although she was born a Quaker, Dolley saw herself as a fighter. “I have always been an advocate for fighting when assailed,” she wrote to her cousin, Edward Coles, in a May 1813 letter discussing the possibility of a British attack on the city. Spirits had risen when news of an American victory over the British frigate Macedonian, off the Canary Islands, reached the capital during a ball given in December 1812 to celebrate Congress’ decision to enlarge the Navy at last. When a young lieutenant arrived at the ball carrying the flag of the defeated ship, senior naval officers paraded it around the floor, then laid it at Dolley’s feet. At social events, Dolley strived, in the words of one observer, “to destroy rancorous feelings, then so bitter between Federalists and Republicans.” Members of Congress, weary of flinging curses at each other during the day, seemed to relax in her presence and were even willing to discuss compromise and conciliation. Almost all their wives and daughters were Dolley’s allies. By day Dolley was a tireless visitor, leaving her calling cards all over the city. Before the war, most of her parties attracted about 300 people. Now attendance climbed to 500, and young people began calling them “squeezes.” Dolley undoubtedly felt the stress of presiding over these crowded rooms. “My head is dizzy!” she confessed to a friend. But she maintained what an observer called her “remorseless equanimity,” even when news was bad, as it often was. Critics heaped scorn on the president, calling him “Little Jemmy” and reviving the smear that he was impotent, underscoring the battlefield defeats over which he had presided. But Dolley seemed immune to such slander. And if the president looked as if he had one foot in the grave, Dolley bloomed. 
More and more people began bestowing a new title on her: first lady, the first wife of a U.S. president to be so designated. Dolley had created a semipublic office as well as a unique role for herself and those who would follow her in the White House. She had long since moved beyond the diffidence with which she had broached politics in her letters to her husband nearly a decade before, and both had jettisoned any idea that a woman should not think about so thorny a subject. In the first summer of his presidency in 1809, Madison had been forced to rush back to Washington from a vacation at Montpelier, his Virginia estate, leaving Dolley behind. In a note he wrote to her after returning to the White House, he said he intended to bring her up to date on intelligence just received from France. And he sent her the morning newspaper, which had a story on the subject. In a letter two days later, he discussed a recent speech by the British prime minister; clearly, Dolley had become the president’s political partner. The British had been relentless in their determination to reduce Americans to obedient colonists once more. Checked by an American naval victory on Lake Erie on September 10, 1813, and the defeat of their Indian allies in the West, almost a month later, the British concentrated their assault on the coastline from Florida to Delaware Bay. Again and again their landing parties swarmed ashore to pillage homes, rape women, and burn public and private property. The commander of these operations was Sir George Cockburn, a strutting, red-faced rear admiral, widely considered to be as arrogant as he was ruthless. Even as many Washington residents began packing up families and furniture, Dolley, in correspondence at the time, continued to insist that no British Army could get within 20 miles of the city. But the drumbeat of news about earlier landings—British troops had sacked Havre de Grace, Maryland, on May 4, 1813, and tried to take Craney Island, near Norfolk, Virginia, in June of that year—intensified criticism of the president. Some claimed that Dolley herself was planning to flee Washington; if Madison attempted to abandon the city as well, critics threatened, the president and the city would “fall” together. Dolley wrote in a letter to a friend: “I am not the least alarmed at these things but entirely disgusted & determined to stay with him.” On August 17, 1814, a large British fleet dropped anchor at the mouth of the Patuxent River, only 35 miles from the nation’s capital. Aboard were 4,000 veteran troops under the command of a tough professional soldier, Maj. Gen. Robert Ross. They soon came ashore in Maryland without a shot being fired and began a slow, cautious advance on Washington. There was not a single trained American soldier in the vicinity to oppose them. All President Madison could do was call out thousands of militia. The commander of these jittery amateurs was Brig. Gen. William Winder, whom Madison had appointed largely because his uncle, the governor of Maryland, had already raised a sizable state militia. Winder’s incompetence became obvious, and more and more of Dolley’s friends urged her to flee the city. By now thousands of Washingtonians were crowding the roads. But Dolley, whose determination to stay with her husband was unwavering, remained. She welcomed Madison’s decision to station 100 militiamen under the command of a regular Army colonel on the White House lawn. 
Not only was it a gesture of protection on his part, it was also a declaration that he and Dolley intended to stand their ground. The president then decided to join the 6,000 militiamen who were marching to confront the British in Maryland. Dolley was sure his presence would stiffen their resolve. After the president had ridden off, Dolley decided to show her own resolve by throwing a dinner party, on August 23. But after The National Intelligencer newspaper reported that the British had received 6,000 reinforcements, not a single invitee accepted her invitation. Dolley took to going up to the White House roof to scan the horizon with a spyglass, hoping to see evidence of an American victory. Meanwhile, Madison sent her two scribbled messages, written in quick succession on August 23. The first assured her that the British would easily be defeated; the second warned her to be ready to flee on a moment’s notice. Her husband had urged her, if the worst happened, to save the cabinet papers and every public document she could cram into her carriage. Late in the afternoon of August 23, Dolley began a letter to her sister Lucy, describing her situation. “My friends and acquaintances are all gone,” she wrote. The army colonel and his 100-man guard had also fled. But, she declared, “I am determined not to go myself until I see Mr. Madison safe.” She wanted to be at his side “as I hear of much hostility toward him...disaffection stalks around us.” She felt her presence might deter enemies ready to harm the president. At dawn the next day, after a mostly sleepless night, Dolley was back on the White House roof with her spyglass. Resuming her letter to Lucy at midday, she wrote that she had spent the morning “turning my spy glass in every direction and watching with unwearied anxiety, hoping to discern the approach of my dear husband and his friends.” Instead, all she saw was “groups of military wandering in all directions, as if there were a lack of arms, or of spirit to fight for their own firesides!” She was witnessing the disintegration of the army that was supposed to confront the British at nearby Bladensburg, Maryland. Although the boom of cannon was within earshot of the White House, the battle—five or so miles away at Bladensburg—remained beyond the range of Dolley’s spyglass, sparing her the sight of American militiamen fleeing the charging British infantry. President Madison retreated toward Washington, along with General Winder. At the White House, Dolley had packed a wagon with the red silk velvet draperies of the Oval Room, the silver service and the blue and gold Lowestoft china she had purchased for the state dining room. Resuming her letter to Lucy on that afternoon of the 24th, Dolley wrote: “Will you believe it, my sister? We have had a battle or skirmish...and I am still here within sound of the cannon!” Gamely, she ordered the table set for a dinner for the president and his staff, and insisted that the cook and his assistant begin preparing it. “Two messengers covered with dust” arrived from the battlefield, urging her to flee. Still she refused, determined to wait for her husband. She ordered the dinner to be served. She told the servants that if she were a man, she would post a cannon in every window of the White House and fight to the bitter end. The arrival of Maj. Charles Carroll, a close friend, finally changed Dolley’s mind. When he told her it was time to go, she glumly acquiesced. 
As they prepared to leave, according to John Pierre Sioussat, the Madison White House steward, Dolley noticed the Gilbert Stuart portrait of George Washington in the state dining room. She could not abandon it to the enemy, she told Carroll, to be mocked and desecrated. As he looked anxiously on, Dolley ordered servants to take down the painting, which was screwed to the wall. Informed they lacked the proper tools, Dolley told the servants to break the frame. (The president’s enslaved White House footman, Paul Jennings, later produced a vivid account of these events; see sidebar, p. 55.) About this time, two more friends—Jacob Barker, a wealthy ship owner, and Robert G. L. De Peyster—arrived at the White House to offer whatever help might be needed. Dolley would entrust the painting to the two men, saying they must conceal it from the British at all costs; they would transport the portrait to safety in a wagon. Meanwhile, with remarkable self-possession, she completed her letter to Lucy: “And now, dear sister, I must leave this house...where I shall be tomorrow, I cannot tell!” As Dolley headed for the door, according to an account she gave to her grandniece, Lucia B. Cutts, she spotted a copy of the Declaration of Independence in a display case; she put it into one of her suitcases. As Dolley and Carroll reached the front door, one of the president’s servants, a free African-American named Jim Smith, arrived from the battlefield on a horse covered in sweat. “Clear out! Clear out,” he shouted. The British were only a few miles away. Dolley and Carroll climbed into her carriage and were driven away to take refuge at his comfortable family mansion, Belle Vue, in nearby Georgetown. The British arrived in the nation’s capital a few hours later, as darkness fell. Admiral Cockburn and General Ross issued orders to burn the Capitol and the Library of Congress, then headed to the White House. According to Lt. James Scott, Cockburn’s aide-de-camp, they found the dinner Dolley had ordered still on the table in the dining room. “Several kinds of wine in handsome cut glass decanters sat on the sideboard,” Scott would later recall. The officers sampled some of the dishes and drank a toast to “Jemmy’s health.” Soldiers roamed the house, grabbing souvenirs. According to historian Anthony Pitch, in The Burning of Washington, one man strutted around with one of President Madison’s hats on his bayonet, boasting that he would parade it through the streets of London if they failed to capture “the little president.” Under Cockburn’s direction, 150 men smashed windows and piled White House furniture in the center of the various rooms. Outside, 50 of the marauders carrying poles with oil-soaked rags on the ends surrounded the house. At a signal from the admiral, men with torches ignited the rags, and the flaming poles were flung through the smashed windows like fiery spears. Within minutes, a huge conflagration soared into the night sky. Not far away, the Americans had set the Navy Yard on fire, destroying ships and warehouses full of ammunition and other materiel. For a time, it looked as if all Washington were ablaze. The next day, the British continued their depredations, burning the Treasury, the State and War departments and other public buildings. An arsenal on Greenleaf’s Point, about two miles south of the Capitol, exploded while the British were preparing to destroy it. Thirty men were killed and 45 were injured. Then a freak storm suddenly erupted, with high winds and violent thunder and lightning. 
The shaken British commanders soon retreated to their ships; the raid on the capital had ended. Meanwhile, Dolley had received a note from Madison urging her to join him in Virginia. By the time they were finally reunited there on the night of August 25, the 63-year-old president had barely slept in several days. But he was determined to return to Washington as soon as possible. He insisted that Dolley remain in Virginia until the city was safe. By August 27, the president had re-entered Washington. In a note written hastily the next day, he told his wife: “You cannot return too soon.” The words seem to convey not only Madison’s need for her companionship but also his recognition that she was a potent symbol of his presidency. On August 28, Dolley joined her husband in Washington. They stayed at the home of her sister Anna Payne Cutts, who had taken over the same house on F Street that the Madisons had occupied before moving to the White House. The sight of the ruined Capitol—and the charred, blackened shell of the White House—must have been almost unbearable for Dolley. For several days, according to friends, she was morose and tearful. A friend who saw President Madison at this time described him as “miserably shattered and woebegone. In short, he looks heartbroken.” Madison also felt betrayed by General Winder—as well as by his Secretary of War, John Armstrong, who would resign within weeks—and by the ragtag army that had been routed. He blamed the retreat on low morale, the result of all the insults and denunciations of “Mr. Madison’s War,” as the citizens of New England, the center of opposition, labeled the conflict. In the aftermath of the British rampage through the nation’s capital, many urged the president to move the government to a safer place. The Common Council of Philadelphia declared its readiness to provide housing and office space for both the president and Congress. Dolley fervently maintained that she and her husband—and Congress—should stay in Washington. The president agreed. He called for an emergency session of Congress to take place on September 19. Meanwhile, Dolley had persuaded the Federalist owner of a handsome brick dwelling on New York Avenue and 18th Street, known as the Octagon House, to let the Madisons use it as an official residence. She opened the social season there with a crowded reception on September 21. Dolley soon found unexpected support elsewhere in the country. The White House had become a popular national symbol. People reacted with outrage when they heard that the British had burned the mansion. Next came a groundswell of admiration as newspapers reported Dolley’s refusal to retreat and her rescue of George Washington’s portrait and perhaps also a copy of the Declaration of Independence. On September 1, President Madison issued a proclamation “exhorting all the good people” of the United States “to unite in their hearts and hands” in order “to chastise and expel the invader.” Madison’s former opponent for the presidency, DeWitt Clinton, said there was only one issue worth discussing now: Would the Americans fight back? On September 10, 1814, the Niles’ Weekly Register, a Baltimore paper with a national circulation, spoke for many. “The spirit of the nation is roused,” it editorialized. The British fleet sailed into the port of Baltimore three days later, on September 13, determined to batter Fort McHenry into submission—which would allow the British to seize harbor ships and to loot waterfront warehouses—and force the city to pay a ransom. 
Francis Scott Key, an American lawyer who had gone aboard a British flagship at the request of President Madison to negotiate the release of a doctor seized by a British landing party, was all but certain that the fort would surrender to a nightlong bombardment by the British. When Key saw the American flag still flying at sunrise, he scribbled a poem that began, “Oh say can you see by the dawn’s early light?” Within a few days, the words, set to the music of a popular song, were being sung all over Baltimore. Good news from more distant fronts also soon reached Washington. An American fleet on Lake Champlain won a surprise victory over a British armada on September 11, 1814. The discouraged British had fought a halfhearted battle there and retreated to Canada. In Florida, after a British fleet arrived in Pensacola Bay, an American Army commanded by Gen. Andrew Jackson seized Pensacola (under Spanish control since the late 1700s) in November 1814. Thus, the British were deprived of a place to disembark. President Madison cited these victories in a message to Congress. But the House of Representatives remained unmoved; it voted 79-37 to consider abandoning Washington. Still, Madison resisted. Dolley summoned all her social resources to persuade the congressmen to change their minds. At Octagon House, she presided over several scaled-down versions of her White House galas. For the next four months, Dolley and her allies lobbied the legislators as they continued to debate the proposal. Finally, both houses of Congress voted not only to stay in Washington but also to rebuild the Capitol and White House. The Madisons’ worries were by no means over. After the Massachusetts legislature called for a conference of the five New England states to meet in Hartford, Connecticut, in December 1814, rumors swept the nation that the Yankees were going to secede or, at the very least, demand a semi-independence that could spell the end of the Union. A delegate leaked a “scoop” to the press: President Madison would resign. Meanwhile, 8,000 British forces had landed in New Orleans and clashed with General Jackson’s troops. If they captured the city, they would control the Mississippi River Valley. In Hartford, the disunion convention dispatched delegates to Washington to confront the president. On the other side of the Atlantic, the British were making outrageous demands of American envoys, headed by Treasury Secretary Albert Gallatin, aimed at reducing the United States to subservience. “The prospect of peace appears to get darker and darker,” Dolley wrote to Gallatin’s wife, Hannah, on December 26. On January 14, 1815, a profoundly worried Dolley wrote again to Hannah: “The fate of N Orleans will be known today—on which so much depends.” She was wrong. The rest of January trickled away with no news from New Orleans. Meanwhile, the delegates from the Hartford Convention reached Washington. They were no longer proposing secession, but they wanted amendments to the Constitution restricting the president’s power, and they vowed to call another convention in June if the war continued. There was little doubt that this second session would recommend secession. Federalists and others predicted New Orleans would be lost; there were calls for Madison’s impeachment. On Saturday, February 4, a messenger reached Washington with a letter from General Jackson reporting that he and his men had routed the British veterans, killing and wounding about 2,100 of them with a loss of only 7. 
New Orleans—and the Mississippi River—would remain in American hands! As night fell and the news swept through the nation’s capital, thousands of cheering celebrants marched along the streets carrying candles and torches. Dolley placed candles in every window of Octagon House. In the tumult, the Hartford Convention delegates stole out of town, never to be heard from again. Ten days later, on February 14, came even more astonishing news: Henry Carroll, secretary to the American peace delegation, had returned from Ghent, Belgium. A buoyant Dolley urged her friends to attend a reception that evening. When they arrived, they were told that Carroll had brought a draft of a peace treaty; the president was upstairs in his study, discussing it with his cabinet. The house was jammed with representatives and senators from both parties. A reporter from The National Intelligencer marveled at the way these political adversaries were congratulating each other, thanks to the warmth of Dolley’s smile and rising hopes that the war was over. “No one... who beheld the radiance of joy which lighted up her countenance,” the reporter wrote, could doubt “that all uncertainty was at an end.” This was a good deal less than true. In fact, the president had been less than thrilled by Carroll’s document, which offered little more than an end to the fighting and dying. But he decided that accepting it on the heels of the news from New Orleans would make Americans feel they had won a second war of independence. Dolley had shrewdly stationed her cousin, Sally Coles, outside the room where the president was making up his mind. When the door opened and Sally saw smiles on every face, she rushed to the head of the stairs and cried: “Peace, Peace.” Octagon House exploded with joy. People rushed to embrace and congratulate Dolley. The butler began filling every wineglass in sight. Even the servants were invited to drink, and according to one account, would take two days to recover from the celebration. Overnight, James Madison had gone from being a potentially impeachable president to a national hero, thanks to Gen. Andrew Jackson’s—and Dolley Madison’s—resolve. Demobilized soldiers were soon marching past Octagon House. Dolley stood on the steps beside her husband, accepting their salutes. Adapted from The Intimate Lives of the Founding Fathers by Thomas Fleming. Copyright © 2009. With the permission of the publisher, Smithsonian Books, an imprint of HarperCollins Publishers.
866a15ee5def476790bfc2a1d6a79bcf
https://www.smithsonianmag.com/history/how-epidemics-past-forced-americans-promote-health-ended-up-improving-life-this-country-180974555/?utm_source=nextdraft&utm_medium=email
How Epidemics of the Past Changed the Way Americans Lived
How Epidemics of the Past Changed the Way Americans Lived At the end of the 19th century, one in seven people around the world had died of tuberculosis, and the disease ranked as the third leading cause of death in the United States. While physicians had begun to accept German physician Robert Koch’s scientific confirmation that TB was caused by bacteria, this understanding was slow to catch on among the general public, and most people gave little attention to the behaviors that contributed to disease transmission. They didn’t understand that things they did could make them sick. In his book, Pulmonary Tuberculosis: Its Modern Prophylaxis and the Treatment in Special Institutions and at Home, S. Adolphus Knopf, an early TB specialist who practiced medicine in New York, wrote that he had once observed several of his patients sipping from the same glass as other passengers on a train, even as “they coughed and expectorated a good deal.” It was common for family members, or even strangers, to share a drinking cup. With Knopf’s guidance, in the 1890s the New York City Health Department launched a massive campaign to educate the public and reduce transmission. The “War on Tuberculosis” public health campaign discouraged cup-sharing and prompted states to ban spitting inside public buildings and transit and on sidewalks and other outdoor spaces—instead encouraging the use of special spittoons, to be carefully cleaned on a regular basis. Before long, spitting in public spaces came to be considered uncouth, and swigging from shared bottles was frowned upon as well. These changes in public behavior helped successfully reduce the prevalence of tuberculosis. In the 19th century, city streets in the U.S. overflowed with filth. People tossed their discarded newspapers, food scraps, and other trash out their windows onto the streets below. The plentiful horses pulling streetcars and delivery carts contributed to the squalor, as each one dropped over a quart of urine and pounds of manure every day. When a horse died, it became a different kind of hazard. In “Portrait of an Unhealthy City,” Columbia University professor David Rosner writes that since horses are so heavy, when one died in New York City, “its carcass would be left to rot until it had disintegrated enough for someone to pick up the pieces. Children would play with dead horses lying on the streets.” More than 15,000 horse carcasses were collected and removed from New York streets in 1880. Human waste was a problem, too. Many people emptied chamber pots out their windows. Those in tenement housing did not have their own facilities, but had 25 to 30 people sharing a single outhouse. These privies frequently overflowed until workers known as “night soil men” arrived to haul away the dripping barrels of feces, only to dump them into the nearby harbor. As civic and health leaders began to understand that the frequent outbreaks of tuberculosis, typhoid and cholera that ravaged their cities were connected to the garbage, cities began setting up organized systems for disposing of human urine and feces. Improvements in technology helped the process along. Officials began introducing sand filtration and chlorination systems to clean up municipal water supplies. Indoor toilets were slow to catch on, due to cost, issues with controlling the stench, and the need for a plumbing system. Following Thomas Crapper’s improved model in 1891, water closets became popular, first among the wealthy, and then among the middle-class. 
Plumbing and sewage systems, paired with tenement house reform, helped remove excrement from the public streets. Responses to disease radically reshaped aspects of American culture, too. As physicians came to believe that good ventilation and fresh air could combat illness, builders started adding porches and windows to houses. Real estate investors used the trend to market migration to the West, prompting Eastern physicians to convince consumptives and their families to move thousands of miles from crowded, muggy Eastern cities to the dry air and sunshine in places like Los Angeles and Colorado Springs. The ploy was so influential that in 1872, approximately one-third of Colorado’s population consisted of people who had moved to the territory seeking relief from tuberculosis. Some of this sentiment continues today. While we now know that sunshine alone is no cure, good ventilation and time spent outside do benefit children and adults by promoting physical activity and improving spirits—and access to outdoor spaces and parks still entices homebuyers. This fresh-air “cure” also eventually spurred the study of climate as a formal science, as people began to chart temperature, barometric pressure and other weather patterns in hopes of identifying the “ideal” conditions for treating disease. Epidemics of the past established an ethos of altruism in the U.S. During the 1793 yellow fever epidemic, Philadelphians selflessly stepped up to save their city. With no formal crisis plan, Mayor Matthew Clarkson turned to volunteers to collect clothing, food and monetary donations; to pitch a makeshift hospital; and to build a home for 191 children temporarily or permanently orphaned by the epidemic. Members of the Free African Society, an institution run by and for the city’s black population, were particularly altruistic, providing two-thirds of the hospital staff, transporting and burying the dead and performing numerous other medical tasks. A 20th-century diphtheria outbreak in a small region of the Alaska Territory inspired a national rally of support—and created the Iditarod, the famous dog sled race. When cases of “the children’s disease” began to mount in Nome, Alaska, in January 1925, the town was in trouble. Diphtheria bacteria produce a toxin that makes the disease especially deadly unless an antitoxin serum is administered. This serum had been readily available for decades, but Nome’s supply had run short, and the town was inaccessible by road or sea in the winter. Leaping into action, 20 of the area’s finest dogsled teams and mushers carried a supply of the serum all the way from Fairbanks—674 miles—in record time, facing temperatures of more than 60 degrees below zero. Their delivery on February 2nd, plus a second shipment a week later, successfully halted the epidemic, saving Nome’s children from suffocation. Newspapers across the country covered the rescue. It was also memorialized in movies (including the animated Balto), with a Central Park statue—and, most notably, with the annual Iditarod race. The significant challenges of delivery by dogsled also sparked investigation into the possibilities of medical transport by airplane, which is routine in remote areas today but was then still in its infancy. Diseases fueled the growth of fundraising strategies. The polio epidemic of 1952 sickened more than 57,000 people across the United States, causing 21,269 cases of paralysis. 
The situation became so dire that at one point, the Sister Kenny Institute in Minneapolis, a premier polio treatment facility, temporarily ran out of cribs for babies with the disease. In response, the National Foundation for Infantile Paralysis (NFIP), which had been founded in 1938 by President Franklin D. Roosevelt and later came to be known as the March of Dimes, distributed around $25 million through its local chapters. It provided iron lungs, rocking chairs, beds and other equipment to medical facilities, and assigned physicians, nurses, physical therapists, and medical social workers where they were needed. The March of Dimes’ success has served as the gold standard in public health education and fundraising since its heyday in the 1940s and 1950s. Public health emergencies have inspired innovations in education. Starting in 1910, Thomas Edison’s lab, which had invented one of the first motion picture devices in the 1890s, partnered with anti-TB activists to produce short films on tuberculosis prevention and transmission—some of the first educational movies. Screened in public places in rural areas, the TB movies were also the first films—of any type—that many viewers had ever seen. The anti-tuberculosis crusade was also a model for later NFIP efforts to combat polio, which relentlessly kept that disease at the front of the public agenda until an effective vaccine was developed and implemented, and it set a standard for future public health campaigns. Past epidemics fueled the growth of civic debate and journalism in the U.S., too. As far back as colonial times, newspapers built their audiences by providing an outlet for debate on controversial issues, including disease. Founders of the New England Courant—the first paper in Colonial America to print the voices and perspectives of the colonists—launched their paper as a vehicle to oppose smallpox inoculation during the 1721 Boston epidemic. As smallpox ravaged the city, a Boston doctor named Zabdiel Boylston began using inoculation, a practice in which people are intentionally infected with a disease, to produce milder cases and reduce mortality risk. Backed by those opposed to the practice, James Franklin started the Courant to serve as a tool to fight it. Inoculation’s success was demonstrated in the 1721 epidemic and in later smallpox outbreaks, eventually convincing even staunch opponents of its value—but by inspiring an outlet to air their concerns, the anti-inoculation camp had made an important contribution to public discourse.
e9433863e935e8dca3600b77be320a93
https://www.smithsonianmag.com/history/how-first-lady-sarah-polk-set-model-conservative-female-power-180971393/
How First Lady Sarah Polk Set a Model for Conservative Female Power
How First Lady Sarah Polk Set a Model for Conservative Female Power In July 1848, as hundreds of women suffragists gathered in Seneca Falls to demand the right to vote and assert their right to participate in the public sphere, one prominent woman in Washington, D.C., was busy shaping the nation’s policy and guiding its direction at the highest level of government. Unfortunately for the activists, she didn’t share their politics. First Lady Sarah Polk formed half of an unusual political partnership with her husband, President James Polk, during his sole term in office from 1845 to 1849. Despite his brief tenure, Polk had an outsized influence on American history, particularly with regard to the Mexican-American War. As president, Polk sought his wife’s counsel on decisions, relied on her smart politicking and benefited from her popularity. Her active role in his presidency made her the most powerful woman of the era, asserts Amy S. Greenberg, professor of history and women’s studies at Pennsylvania State University and author of the new book Lady First: The World of First Lady Sarah Polk. Religious and conservative, Sarah Polk didn’t support the suffragists’ campaign; she had no need for what they sought. She had leveraged her privileges as a white, wealthy, childless and educated woman to become “the first openly political First Lady, in a period when the role of women was strictly circumscribed,” explains Greenberg, whose book hits shelves amidst a wave of feminist political activism. This January, 131 women were sworn into Congress, and the race for the Democratic Party’s 2020 presidential nomination features multiple women candidates. It’s with some irony, then, that this first breakthrough for women in national politics would come from Polk, a figure who viewed women as subservient to men, owned slaves, created a false, populist persona and would, after leaving the White House, become a stalwart supporter of the Confederacy. Over 170 years after Polk left Washington, Greenberg writes, “she set a model of conservative female power that grew and flourished in the century after her death, and which actively shapes our current political moment. Phyllis Schlafly, Nancy Reagan, and Ivanka Trump: all are political heirs of Mrs. James K. Polk.” Smithsonian spoke with Greenberg about the First Lady’s life and legacy. Sarah Polk was the most powerful woman in the United States in the middle of the 19th century. How did she come by that power? How did that power manifest itself? Her power would not have been possible without her reliance on the power of the men around her. We have this idea that before women got suffrage, women were not political actors. But, here's a woman who was, in many ways, super conservative. She didn't support women's rights, and she was surrounded by men who would say, generally, that they did not think that women were deserving of having the vote. She became powerful by being the exception to the rule. It was a rule that even she believed in, which was that politics was really something for men, not for women. The other super important thing is that her husband, the president, relied on her to help him. He really pushed her to be more politically involved than she might have been otherwise. They figured out early, I think, in the relationship that they weren't going to have kids. 
He said to her, “Look, why would you just stay at home like these other wives do? Why don't you accompany me on my travels and help me with my political work? Read all these newspapers and tell me what you think about them.” Either because he didn’t want her to be lonely, or because he perceived that this was something that was going to help him. What did her partnership with her husband look like? President Polk was super unlikeable. From early on in his career, politicians around him found that they were better off communicating with James through Sarah. I found records of when she was in the White House where politicians would come to the White House and they were coming deliberately to meet with her. She also was James’ communications director. There's all these really remarkable letters where men are writing to James, but they'll say in the letter, “If Mrs. Polk is reading this, then please convey so and so.” While James was in the White House, he was also sick often. So, she held receptions without him, or he was too busy to hold the receptions. She became the means by which James was able to accomplish all of this stuff during his one term, even though nobody liked him and people, basically, didn't trust him. It seems to me that Polk couldn’t have successfully prosecuted a war against Mexico without her lobbying other politicians on his behalf. Why was she so popular among Americans? There hadn't been a beloved figure in the White House since Dolley Madison. Sarah was just immediately popular because she was extremely pious. She did a really good job pretending to be down to earth. During this time period, her party, the Democrats, were supposed to be the party of the common man. Sarah just did an amazing job presenting herself as a first lady for [the people], which she did by emphasizing her religiosity. She kept the Sabbath, which, oh my God, people loved that about her. Everything about her appearance seemed really modest. She was very, very good at manipulating her public persona with the press by making sure that stories were printed about her work with the poor. One of my favorite early anecdotes about Sarah was that Congress allotted a tremendous amount of money for remodeling the White House, which was in serious disrepair. But Sarah let it be known that she was not an extravagant person, and so she would only take half the amount of money allotted; people thought that this was fantastic. The reality was she was super extravagant. Personally, she spent a ridiculous amounts of money on her clothes. She wasn't interested in remodeling the White House because she would rather spend her time lobbying politicians and reading newspapers. But [the news reports] made the public think, “Oh, well we have this, actually, thrifty person. That's so fantastic.” How did she negotiate between the masculine and feminine spheres of the era? In a time period when the vast majority of the public believed women were only suited for the private sphere—life within the home, taking care of children, making the house beautiful and being pious—Sarah managed to amass power. She never presented her opinions as her own opinions. She always presented herself as representing her husband. She was able to amass and exercise political power by saying to men, “Well, Mr. Polk thinks this, or that.” Or, “This is really what Mr. Polk would like to have done.” She was so good at presenting herself as deferential to the beliefs of the men that she talked to, so they knew she wasn't trying to challenge them. 
She worked within their system and could be an aid to them in this way. She never challenged men, even on minor points. She always represented herself as submissive, and above all deferential. This allowed her to move back and forth between the world of women and men in a way that other women weren't able to. Although Sarah enjoyed her political power, she didn’t support pathways like suffrage for other women to gain power. Why not? I think it’s safe to say that she didn’t support suffrage because on some level, she just didn’t need it. She found a way to gain her political power without suffrage. In a way, there’s a hypocritical aspect to her personality, which is that she’s perfectly fine with not allowing other women the rights that she, herself, has. If you wanted to be more generous, you could say, “Well, she didn't support suffrage because she was coming out of this extremely conservative, religiously based mindset whereby hierarchy is enshrined in the Bible.” She’s a huge supporter of slavery, and she believes that the Bible says wives be subservient to their husbands and that black people be subservient to white people. In this time period, a lot of rich, white women out there figure out that their class position is allowing them to operate in ways that our historical narrative doesn't tell us about, which is that they're able to be really powerful because they’re rich, because they’re white, and because they’re surrounded by men who acknowledge their right to exert influence in the political arena. What role did Sarah play in championing “Manifest Destiny” and the war with Mexico? Sarah grew up in a household where the family became wealthy by moving onto the land that was taken from Native Americans, and then farming and growing cotton on that land with slaves. She grew up believing that the way to wealth was through moving west, because this is what her family had done. She supported Manifest Destiny from the beginning, as did her husband who grew up in a similar situation. [During the presidential campaign,] James Polk was the most explicit about claiming that God had chosen the people of the United States to expand across the continent. While other Democrats were more restrained, about the idea of Mexico being entitled to the land that they owned, or even Great Britain having some rights on the continent, James was really out in front and saying, “No. No, America’s destiny is to occupy all of the lands that are currently being occupied by these less deserving people.” Those were Sarah's views, too. She maintained until the end of her life that one of the greatest achievements in American history was the war that her husband had directed against Mexico because it led to annexing California, Nevada and most of Arizona to the United States. When she was in the White House she was very careful to make sure that veterans of the Mexican-American war were invited to parties and shown particular respect. While the U.S. was fighting Mexico, she had extra evening receptions at the White House, complete with military music, preferably with veterans in attendance, where she could lobby different members of Congress to continue supporting the war. Sarah and James owned dozens of slaves. Can you talk about her time as a plantation owner? When James ran for president, he had to conform to the views of many Americans, especially Americans who lived in the North, that slavery was not necessarily an ideal system. 
He maintained that he never bought or sold slaves, except to keep families together. To the extent that was true, it was only true because of Sarah. When she married James, she insisted that the slaves she had inherited from her father be allowed to stay with family members, and she wouldn't let any of them be sold away from the family. After James died, she became the sole owner of their cotton plantation which James had bought and stocked with very young slaves, despite his claims that he wasn't buying and selling slaves. With Sarah’s help, he was buying all sorts of young people, taking them away from their families and sending them to Mississippi, which was absolutely the worst place to be a slave in the United States. The work was back-breaking, and all these people had been taken away from their families. Sarah had a relationship to her slave property that could best be described as paternalistic. She was invested in this view that she was a “good” slave owner. Of course, in reality, she wasn't a good slave owner because she was holding these people in bondage. Throughout the 1850s, she managed this cotton plantation herself, which forced her to come to terms with the fact that there was no such thing as being a beneficent slave owner. She ended up selling slaves away from the plantation, despite her claim that she would never do such a thing. Then right before the Civil War, she sold a half-interest in the plantation and made a tremendous amount of money by basically selling slaves en masse. When the Civil War started, Sarah was a widow living in Tennessee. How did she behave during the conflict? She remains in her house throughout the Civil War in Nashville because her husband's grave is there. She says she'll never leave it, so she stays when a lot of other wealthy and powerful Confederates leave. Sarah manages this remarkable trick, which is to claim that her house is neutral territory, that she, herself, was neutral and that she was entitled to be treated with respect by everybody because she was a First Lady. Her husband had given his life to the Union, and so she needed to be treated not only with respect, but actually to get special favors from the Union army. All of these Union generals really don't trust her and believe she's actually a dyed-in-the-wool, hardcore Confederate, which I think basically she is. They have to do what she wants because she is First Lady Sarah Polk, and she manages to actually pull this one over. While all these Union generals are treating her with respect and allowing her to travel around and to sell cotton, despite the ban on Confederates selling cotton, Sarah is secretly working on behalf of the Confederacy. She’s not a spy, but she’s hiding valuable confederate property in the house for people who aren't as well-situated as her, sending money on behalf of imprisoned Confederates, and asking for special treatment of and leniency for Confederate soldiers. She spends the entire Civil War using her power to help the Confederacy. What was Sarah Polk’s lasting influence? Sarah Polk left a legacy that we still see today of conservative women who pretend to be deferential to men and use that pretense to actually amass and exercise power. I see her as the beginning of an American tradition of conservative women who, because of their wealth, political connections and power, are perfectly happy exercising rights that they're not necessarily willing to extend to other people. Anna Diamond is the former assistant editor for Smithsonian magazine.
b8a18e74cefde94d3fb8a387013c7851
https://www.smithsonianmag.com/history/how-halloween-has-taken-over-england-180953211/
How Halloween Has Taken Over England
How Halloween Has Taken Over England In England, Halloween is so hot right now. And what's making it more unbearable for some is that the increasingly popular, Americanized celebration of Halloween on October 31 may be coming at the expense of the most staunchly English (although equally insubordinate) of holidays: Guy Fawkes Day on November 5. That holiday, also known as Bonfire Night, is a commemoration of the foiled Gunpowder Plot by disgruntled Catholics to blow up Parliament, with the Protestant King James I inside. Celebrated like the Fourth of July, the holiday traditionally featured fireworks, parades, blazing bonfires, and effigies of Fawkes (and the Pope). But increasingly, revelers in the United Kingdom are combining the holidays, and what has long been a distinctly British event has taken on more and more of an American flavor. "I have a distinct sense that Halloween is overtaking or has overtaken Guy Fawkes Night," says James Sharpe of the University of York in England, who has studied the history of these holidays. Some data and much anecdotal evidence back this up: In an article last year on Halloween in the U.K., the New York Times reported that sales of Halloween-related products were expected to grow 12 percent in 2013 from the previous year. Halloween dress-up balls and parties are becoming popular with young Brits, just as they have been with their American counterparts. Trick or treat candies are collected along with pennies for the Guy. Houses and shops are decorated with images of witches, pumpkins and Michael Myers—even pets are dressed in silly Halloween costumes. "It's certainly true that Halloween is now a 'thing' in the U.K., in a way that wasn't true when I was a child," says Dr. Susan Greenberg, senior lecturer in creative writing at London's University of Roehampton, and a dual national who has lived in the U.K. since childhood. Some Brits are not happy to see Guy Fawkes Day being eclipsed by Halloween. Sharpe, for one, proudly considers himself a "Halloween Scrooge," and says that, in his opinion, the Americanized way the holiday is being marked in England is "rather brainless." Who’s to blame? "I hate to say this, but what's happening is a result of U.S. cultural imperialism," Sharpe says, citing a national poll in the U.K., conducted by the market research firm YouGov, in which forty-five percent of those surveyed thought Halloween "an unwelcome American cultural import." (Presumably the other fifty-five percent were busy celebrating it.) Some might consider the idea of dismissing Halloween as an American intrusion into British culture ironic considering that its roots are found in Scotland and Ireland. Then again, nobody was walking around dressed up as a banana in 12th-century Scotland. Nicholas Rogers, author of the book Halloween: From Pagan Ritual to Party Night, sees the Halloween-Guy Fawkes competition differently. "I know some in England want to paint it as cultural imperialism," says Rogers, a native of Bristol, who teaches history at York University in Toronto. But, he points out, it is the British who have changed as much as the holidays they celebrate. "In a more multicultural Britain, Guy Fawkes is a bit of an embarrassment," Rogers says. "What you’re doing is burning a Catholic on a bonfire, and that doesn't go down very well today." The actual history of the Gunpowder Plot (or the Powder Treason as it was also known) has also undergone some re-evaluation. 
"The courage of the Powder Plotters is undeniable and even those hottest in condemning their enterprise have paid tribute to it," wrote historian Antonia Fraser in her acclaimed 1996 book on the Plot, Faith and Treason. Guy Fawkes and his co-conspirators may very well have been what we would today call terrorists, but given the oppression of Catholics in England at the time, Fraser argues, they were "perhaps brave, misguided men...whose motives, if not their actions, were noble and idealistic." While the holiday in his name may be declining in popularity, Fawkes himself has enjoyed a career comeback as a symbol of protest in the 21st century: thanks to the 2006 movie "V for Vendetta," in which the eponymous hero, the anarchist V, wears a Guy Fawkes mask in his efforts to overthrow a fascist British government in a dystopian future, Fawkes's visage has become the unofficial face of the Occupy movement and the hacker group Anonymous. Halloween labors under no such political baggage. While the celebrations in Britain do owe a good deal to the American version of the holiday, Rogers notes that Halloween here in the U.S. continues to evolve, too, reflecting our own changing society and accommodating the rites and traditions of other seasonal festivals, including the Day of the Dead, a Mexican holiday celebrated from October 31 to November 2. "In cities like San Antonio and Los Angeles," Rogers says, "you've now got a fused holiday. You've got sugar skulls, a traditional Day of the Dead Mexican treat, co-existing with people dressed up as witches." Similarly, he suspects Halloween and Guy Fawkes Day may find a way to coexist in Britain. In some parts of Northern Ireland and Canada, they've already managed to dampen the anti-Catholic undertones while keeping the fires burning on November 5. Celebrants there have simply taken Guy Fawkes, in name and effigy, out of the holiday. "They have a Guy-less bonfire," Rogers says dryly. It's doubtful that in a country with a large Catholic population, Americans would appropriate Guy Fawkes Day as a holiday of their own, even though in pre-Revolutionary War Boston, it was actually celebrated as "Pope's Day" with effigies of the Pope joining Fawkes as objects of desecration. That's just as well. Besides being offensive, one thing colonial Pope's Day shared with American Halloween and the British Guy Fawkes Day is that all are marked by a degree of bad behavior on the part of some. In her book, Fraser quotes what she calls the "sensible" words of an American almanac on the subject in 1746: "Powder Plot will not be forgot. 'Twill be observed by many a sot." John Hanc is a writer for Smithsonian, The New York Times, Newsday and Runner's World. He teaches journalism at the New York Institute of Technology in Old Westbury. Hanc’s 15th book—the memoir of Dr. Arun Singh, a cardiac surgeon who has performed more open heart surgeries than almost anyone in history—will be published in 2018 by Center Street, an imprint of Hachette.
4da33ea4eefd435a2d222887fb5fa840
https://www.smithsonianmag.com/history/how-historic-preservation-shaped-early-united-states-180974871/
How Historic Preservation Shaped the Early United States
How Historic Preservation Shaped the Early United States In the middle of the 19th century, the homes of two founding fathers, John Hancock and George Washington, were in danger of being torn down. For the Massachusetts patriot with the famous signature, it was his house just off of Boston Common in the city’s urban center. For the nation’s first president, it was his rural Virginia estate, Mount Vernon. The press covered the potential destruction of the two sites with horror, and according to historian Whitney Martinko, the divergent fates of these homes encapsulate the history of historic preservation in the United States. While the Mount Vernon Ladies Association raised funds to purchase the president’s mansion from his nephew and continues to own and operate the property today, Hancock’s home was sold and torn down to construct new residences. “What did it mean about the United States if its citizens were most interested in how much money they could garner from developing any land available?” asks Martinko. Her new book, Historic Real Estate: Market Morality and the Politics of Preservation in the Early United States, examines this question, among many others, in a fascinating exploration of how Americans grappled with preserving their past (or not) amid economic booms and busts. From the nation’s earliest years, the country’s government and its citizens battled over the costs and benefits of historic preservation, at times grounded in surprisingly progressive beliefs about whose history deserved to be protected. Martinko spoke with Smithsonian about the themes of her book and the history of historic preservation in the United States. Let’s start with the most obvious question—what exactly is historic preservation? Historic preservation is the practice of thinking through how to manage historic resources, and can include things like cemeteries, whole neighborhoods, farms or infrastructure. It encompasses the creation of places like historic house museums that are open to the public, but it also includes places like private homes for individuals who want to keep the historic character of their residence, or business owners who might want to inhabit a historic building, but want to also make use of it through adaptive reuse. It could be as simple as doing some research into the history of a house by looking at things like census records and old deeds, and also looking at physical clues of the house’s past. So you might chip away paint layers on your walls and say, "Oh we found some old paint. We want to try to keep that original character intact." On the local level, historic preservation might also involve writing a nomination for the local historic register. For instance, I live in Philadelphia; there's a local register of historic places that is managed by the city’s historical commission. And those exist all over the United States. What makes the history of “preservation” so compelling? We might think historic preservation is about stopping time, freezing something in the past. But in fact, historic preservation today, as well as in the past, has always been about managing change. 
In the first half of the 19th century, people in the early United States were focused on the future and about managing change in a modern nation. The history of historic preservation also helps us appreciate what has been preserved. Independence Hall has been preserved, Mount Vernon, and a lot of our national iconic sites, as well as local sites—we should understand them in the context of what was demolished. Preserved historic sites are the result of choices that were made continually to keep these buildings in place. Looking at the history of historic preservation helps us to see how people made these decisions, and how those decisions reflected debates about broader social and economic values. What were those values for Americans in the first decades of the United States, between the Revolution and the Civil War? The residents of the early nation tried to work out a very practical, tangible solution to a central issue that they faced then and that we face today: the relationship between pursuing private profit versus the public good. This question took on new importance to people living through the Revolutionary Era, because that project of nation building sparked debates about what would be the guiding values of the United States. Some argued that preserving historic structures was a public good, others that private economic gain—which might mean demolition—was also in the public interest. This debate continues to shape preservation and larger discussions about private versus public interests today. Who gets to decide what is preserved? Historic sites are really interesting because they became a flashpoint. The property owner might want to do one thing, and maybe other citizens in the community wanted to do another, and they’re making claims that this church, or this historic house, or this cemetery really belonged to the entire community. Or that the site carried historic significance for people beyond the property owner. And so these are the debates that I'm really interested in my book. Preservation forced people to make decisions about what private ownership really looked like and whose voices mattered when considering the fate of sites that people thought were historic. What is it about preservation in the early United States that's different and important? The usual history of historic preservation in America often starts with the founding of the Mount Vernon Ladies Association in the 1850s, a moment in the United States we might have called the birth of preservation. The Colonial Revival comes after this, later in the 19th century and early-20th century, where there's interest in either preserving sites from colonial history or making replicas of colonial era objects and homes. The unsuccessful fight to save Penn Station in New York in the early 1960s is also a moment people look to as an important grassroots effort. And of course, federal legislation in the 1960s, the National Historic Preservation Act of 1966 set up the National Register of Historic Places. But the era before 1850 has been overlooked in the context of historic preservation. Many people living in the new nation were engaging in debates over how to keep historic sites. Americans were trying to find tangible solutions to defining the economic and social values of the early United States. Can corporations serve the public good? Or are they only a vehicle for the private interest? 
A lot of historic churches and city sites were owned by corporations, so Americans saw the fate of these sites as an answer to these larger questions. Early Americans debated the preservation of historic structures to answer similar questions about the nature of commercial profits and real estate speculation. John Hancock’s house in Boston and George Washington’s estate at Mount Vernon raised these issues. While one was in the heart of Boston and one was along the Potomac in rural Virginia, in both cases, real estate developers were interested in them as investments, which made people really upset. One rumor was that John Washington, the nephew of George Washington, was going to turn Mount Vernon into a hotel or even a factory site. A similar reaction arose in Boston when developers bought Hancock’s house as a teardown to put in new homes. People wondered how someone could conceive of these properties as anything but sacred sites that should be valued as monuments to the great men who lived in them. And others understood their value as commercial real estate. The Mount Vernon Ladies Association formed and purchased George Washington’s home, and has preserved it to this day. But in 1863 John Hancock's house met a different fate; it became the site of new townhouses. How did the drive for historic preservation mesh with the drive for Westward Expansion? In the 1780s, a number of men moved from Massachusetts into the Ohio Valley and planned the town of what became Marietta, Ohio. They decided that they wanted to legislate the preservation of what they called Monuments of Antiquity, indigenous earthworks built in the Ohio River Valley. They saw these as elements of the built environment and considered them evidence of what they would call human civilization, or in this case, American civilization. Architecture is one of the ways that early Americans thought about the development of history. They thought that you could chart the rise of civilization, in their words, by looking at the material products of particular people at different times. So they saw earthworks as evidence of those who came before them--what they called ancient America. Similarly, they saw colonial mansions built in the 17th century or early 18th century as evidence of the state of society in the colonial era and buildings constructed in the 19th century in the early U.S. as evidence of the state of society in the early United States. So rather than turning away from a colonial or indigenous past, residents of the early United States really embraced these older structures as evidence of what they would consider to be the progressive development of American civilization. And the United States was only the next step in that advancement. Did Native Americans have a role in their own version of preservation? Many residents of the early United States celebrated their idea of indigenous people in the past while denying living communities a place in the United States. U.S. migrants to the Ohio River Valley celebrated and preserved what they saw as ancient abandoned architecture while killing and removing Indigenous residents of the same region. A more complex case of Native Americans involved in debates over preservation, as opposed to being the objects of preservation, was that of Thomas Commuck, a Narragansett man. Commuck had inherited a family farm near Charlestown, Rhode Island, that he wanted to sell to support his move from the Brothertown nation, then in New York State, to Wisconsin. 
The state of Rhode Island was supposed to be holding Narragansett lands in trust for the community, but was also trying to sell off parcels as private property, so it allowed Commuck to do so, too. But at the same time, other Narragansetts stayed in Rhode Island and were trying to keep their homes, their language, and their communities in place. What we see is really two different strategies among the Narragansett for trying to maintain family and survive in the new United States. Thomas Commuck was trying to earn cash to start a new home in the West even as other Narragansetts were trying to preserve their homes in Rhode Island. The difference was that the people in power, the citizens of the state of Rhode Island, would not have recognized what the Narragansetts near Charlestown, Rhode Island, were doing as valuable preservation of the American past. How did other marginalized communities participate in debates about historic preservation? This is an area that really needs more research. One example I found is Peyton Stewart, a free African American living in Boston in the 1830s. He lived in and operated a secondhand clothing shop out of Benjamin Franklin's childhood home in Boston. We know he took an interest in the home’s historic features only because he talked with Edmund Quincy, the wealthy white abolitionist and son of Boston’s mayor, about it, and Quincy recorded that conversation in his diary. At one point, Stewart invited Quincy in to assess the home’s historic character and asked Quincy whether he should buy the building. This shows that Stewart was making enough money to consider purchasing property in Boston, and then he strategically asked a prominent abolitionist and antiquarian for his opinion about the house. Stewart was able to get the attention of a local, prominent Bostonian and build a relationship with him to show that he was, in Quincy’s terms, a "respectable citizen” because he was interested in preserving Boston’s past. This case shows the scarcity of evidence of voices like Stewart’s and the challenges of finding out about buildings that were not preserved. Despite Stewart’s and Quincy’s interest in the building, Benjamin Franklin's childhood home was eventually destroyed in the 1850s. What surprised you during your research? My real surprise was the wide variety of sites that gained attention. Many of these extraordinarily decrepit buildings were not beautiful and were a real contrast to what was considered good living standards. I was also surprised by the national debate that erupted over Ashland, the home of Kentucky politician Henry Clay. When one of his sons, James B. Clay, bought Ashland from his father's estate and announced in the newspapers that he was going to preserve his father's home, everyone was very excited. And then he leveled the house to the ground. A great uproar occurred. And then he said, "No, no, I'm preserving my father's home. I'm building a new and better house on the same foundation." And so this elicited a great debate about what “preservation” of the home really meant. Were there any more modest buildings that were saved under the auspices of historical preservation? Maybe the most humble building that I wrote about in a bit of detail was an old cowshed that some men who were part of the Essex Institute in Salem, Massachusetts, had heard about in the 1860s. It was potentially built from timbers from the 17th-century First Church of Salem. 
So they went out and inspected this old cow shed and decided that it was definitely built from that first church. They reconstructed the church building, taking careful note of what they thought was the original material rescued from the cowshed, and what was filler material. And this reconstruction still stands on the grounds of the Peabody Essex Museum today. We might say, "Well, that's demolition. That's not preservation in the case of Ashland. Or, that's clearly not the first church of Salem; that's bad preservation." What my book tries to do is not judge what was good or bad preservation, or to try to apply the standards of today, but to take people in the past on their own terms when they said that they were engaging in preservation. And then to look carefully at the details of what they did to understand why they thought what they were doing was maintaining a meaningful connection to the past. Karin Wulf is executive director of the Omohundro Institute of American History & Culture and a professor of history at William & Mary.
0fbc5e0641c96a90b2fd7693fe8fb7a3
https://www.smithsonianmag.com/history/how-horace-greeley-invented-persona-crusading-journalist-180974348/
How Horace Greeley Turned Newspapers Legitimate and Saved the Media From Itself
How Horace Greeley Turned Newspapers Legitimate and Saved the Media From Itself December 3, 1840, a Thursday. A bank president in New Jersey goes missing in broad daylight, leaving his office in New Brunswick around 10 a.m. He is never again seen alive. Some say he’s gone to Texas, others say Europe. There are no leads, one way or another, for six days. Then, an impecunious carpenter is seen with a “handsome gold watch,” “unusually flush with money,” boasting of newfound liberation from his mortgage. The trail leads to his home, down the steps into his cellar, under hastily laid floorboards, and into the dirt beneath. There, in a shallow ditch, rests the lost banker, fully clothed, watch missing, skull split from a hatchet blow. Details of the story are familiar. We know them from Edgar Allan Poe’s 1843 gothic horror, “The Tell-Tale Heart,” in which a murderer is tormented by the ceaseless pounding of the heart of the victim he’s buried under his floor. Poe knew the story because he read newspapers. If you were alive, literate, or just vaguely sentient in New York or Philadelphia (where Poe lived) in 1840 and 1841, you probably knew the story, too. You knew it because cheap newspapers covered it in all its gory details for months—covered it with the relentless persistence of the beating heart beneath the floor in Poe’s tale. Daily papers needed readers to survive, after all, and murders—the more shocking, the more grisly, the better—brought readers. But there was one American editor who turned his gaze the other way, hoping to elevate rather than titillate. Horace Greeley thought he could fix American newspapers—a medium that had been transformed by the emergence of an urban popular journalism that was bold in its claims, sensational in its content, and, in Greeley’s estimation, utterly derelict in its responsibilities. As the trial for the bank president’s murder wound to a close in April of 1841, with the killer sent to the gallows, Greeley was just launching the daily newspaper that would make him famous, the New-York Tribune. He should have flogged the New Brunswick case for all it was worth. But the Tribune referenced it just twice. First, Greeley printed a short editorial comment on the killer’s execution, but nothing more: no reporter on the scene, no bold-faced headlines referencing “Peter Robinson’s Last Moments,” “Breaking the Rope,” or “Terrible Excitement.” Then, two days later, Greeley let loose—not to revisit the killing or to meditate on the lessons of the hanging, but to excoriate the newspapers that had so avidly covered both. The coverage, he wrote, amounted to a “pestiferous, death-breathing history,” and the editors who produced it were as odious as the killer himself. “The guilt of murder may not stain their hands,” Greeley thundered, “but the fouler and more damning guilt of making murderers … rests upon their souls, and will rest there forever.” Greeley offered his Tribune, and crafted the editorial persona behind it, in response to the cheap dailies and the new urban scene that animated them. Newspapers, he argued, existed for the great work of “Intelligence”; they existed to inform, but also to instruct and uplift, and never to entertain. Greeley tumbled into New York City in 1831 as a 20-year-old printer. He came from a New England family that had lost its farm. Like thousands of other hayseeds arriving in New York, he was unprepared for what he found. With a population over 200,000, Gotham was a grotesquely magical boomtown. 
Riven by social and political strife, regular calamities and epidemics, and the breakneck pace of its own growth, it was a wild novelty in America. At least there was plenty of printing work to go around. The year after Greeley’s arrival, New York had 64 newspapers, 13 of them dailies. In many ways, though, the press was still catching up to the city’s fantastic new reality. The daily press was dominated by a small core of expensive six-cent “blanket sheets,” mercantile papers that were pitched to merchants’ interests, priced for merchants’ wallets, and sized—as much as five feet wide when spread out—for merchants’ desks. The rest of New York’s papers were weeklies and semiweeklies for particular political parties, reform movements, or literary interests. They tended to rise and fall like the tides at the city’s wharves. Newspapering was a tough business, but in 1833 a printer named Benjamin Day began to figure it out. Day’s New York Sun didn’t look or feel or read or sell like any daily paper in New York at the time. Hawked in the street by newsboys for just a penny, it was a tiny thing—just 7 5/8” x 10 1/4”—packed with stories that illuminated the city’s dark corners. Where newspapers had mostly shunned local reportage, Day and his reporters made the city’s jangling daily carnival ring out from tiny type and narrow columns. The formula was simple: “We newspaper people thrive on the calamities of others,” as Day said. And there was plenty of fodder, be it “fires, theatrical performances, elephants escaping from the circus, [or] women trampled by hogs.” And if accidents, or crime scenes, or police courts, or smoldering ruins offered up no compelling copy, the Sun manufactured it by other means. Take the summer of 1835, when the paper perpetrated the famous “moon hoax” with a series of faked articles about lunar life forms seen through a new telescope. That same year an itinerant editor named James Gordon Bennett launched his penny daily, the New York Herald. There, he perfected the model that Day had pioneered, largely by positioning himself as an all-knowing, all-seeing editorial persona. In 1836, as the Sun and the Herald dueled over coverage of a prostitute’s murder, Bennett fully made his name. His dispatches offered lurid descriptions gleaned from the crime scene, where he claimed access as “an editor on public duty”; his editorials took the bold—and likely false—stance that the prime suspect, a young clerk from an established Connecticut family, was innocent. The Herald soon surpassed the Sun in circulation, drawing in even respectable middle-class readers. The age of the newspaper had dawned, and Bennett crowned himself its champion. “Shakespeare is the great genius of drama, Scott of the novel, Milton and Byron of the poem,” he crowed, “and I mean to be the genius of the newspaper press.” Books, theater, even religion had all “had [their] day”; now, “a newspaper can send more souls to Heaven, and save more from Hell, than all the churches and chapels in New York—besides making money at the same time.” Greeley, a prudish latter-day New England Puritan, looked on in horror. Bennett and Day were making money, but they did so by destroying souls, not saving them. The penny press betrayed the great power of the newspaper to inform, and shirked the great burdens of the editor to instruct. The power of the press was being squandered in an unseemly contest for the lowest common denominator. 
These “tendencies,” Greeley recalled in 1841, “imperatively called for resistance and correction.” Resistance and correction found several expressions, beginning in 1834 with Greeley’s first paper, a “weekly journal of politics and intelligence” called the New-Yorker. There, Greeley promised to “interweave intelligence of a moral, practical, and instructive cast”; he promised to shun the “captivating claptraps” and “experiments on the gullibility of the public”; and he promised to do it all “without humbug.” There were problems with this approach, beginning with the fact that it didn’t pay. Greeley’s limited correspondence during the New-Yorker’s run between 1834 and 1841 reveals the editor continually at or near the financial drowning point. There wasn’t much of a market for instruction and elevation in print, even at $3 a year. “I essay too much to be useful and practical,” he told a friend. “There is nothing that loses people like instruction.” Instruction, if served at all, was best delivered in small doses, and with “sweetmeats and pepper sauce” to make it go down. And there was another problem: How much could a newspaper actually accomplish in correcting the sins of other newspapers? Printed content was like the paper money that was at the root of the era’s regular financial crises: there was too much of it, and no one quite knew what it was worth. The same week that Greeley debuted his New-Yorker, another city paper placed a mock want-ad seeking “a machine for reading newspapers,” one that could “sift the chaff from the wheat,” “the useful facts from idle fictions—the counterfeit coin from the unadulterated metal.” Still, Greeley persisted—certain that the world just needed the right editor and the right newspaper. He put forward the Tribune in 1841 with the assurance that he had found both. Here would be a “newspaper, in the higher sense of the term,” more suited to the “family fireside” than a Bowery barroom. Its columns would be expurgated—no “scoffing infidelity and moral putrefaction,” no “horrid medley of profanity, ribaldry, blasphemy, and indecency.” In their place would go “Intelligence,” Greeley’s notion of journalism as a vehicle not just for news, but for ideas, literature, criticism, and reform. The notion, like the uncouth, wispy-haired towhead himself, was an easy mark for Bennett, who took aim following Greeley’s sermon on the coverage of the New Jersey murder. “Horace Greeley is endeavoring, with tears in his eyes, to show that it is very naughty to publish reports of the trial, confessions, and execution,” Bennett wrote. “No doubt he thinks it’s equally naughty in us to publish a paper at all.” By Bennett’s lights, Greeley’s priggish objections came from his rural roots: “Galvanize a New England squash, and it would make as capable an editor as Horace.” Greeley was simply not up to the work of urban journalism. But Greeley was shrewder than Bennett thought. True, he’d never quite shaken off the dust of the countryside, but that was by choice. Greeley used Bennett’s editorial showmanship as a foil to create his own journalistic persona—setting himself up as a newsprint version of a stock folk figure of the day: the wise country Yankee sizing up a world in flux. Bennett, the savvy urbanite, was the herald telling the city’s dark secrets; Greeley, the rustic intellectual oddball, was the tribune railing against them. There was room for both. Greeley’s Tribune and Greeley the tribune would rise together over the next 30 years, paper and person often indistinguishable. 
The Tribune would never be the newsgathering operation that Bennett’s Herald was, nor would it match the Herald’s circulation in New York City itself. Instead, Greeley would use the city as a platform from which to project an editorial voice outward, to the country beyond. By the eve of the Civil War, the Tribune was reaching a quarter of a million subscribers and many more readers across the northern United States, and Greeley was the most visible and influential newspaper editor in the country. He was, by his own description, a “Public Teacher,” an “oracle” on the Hudson, “exert[ing] a resistless influence over public opinion … creating a community of thought of feeling … giving the right direction to it.” This was the work of journalism. The idea landed with many of the readers who received the Tribune’s weekly edition. They regarded it as they would their own local weeklies: written, composed, and printed by one person. Greeley, in their belief, produced every word. He did little to discourage such impressions, even as the paper became a strikingly modern operation with a corps of editors, armies of compositors and printers, and massive steam-powered presses. “For whatever is distinctive in the views or doctrines of The Tribune,” he wrote in 1847, “there is but one person responsible.” Horace Greeley never quite fixed popular newspapers, or the society that spawned them. The Herald continued to thrive, Bennett continued to bluster, crimes and calamities continued to happen. But Greeley did change newspapers. In making the Tribune into a clearinghouse of information as well as ideas, he made reform-minded, opinion-driven journalism commercially viable, and invented the persona of the crusading journalist. For the next three decades, until his death in 1872, Greeley would demonstrate the power—and limits—of that model. James M. Lundberg is a historian at the University of Notre Dame. He is the author of Horace Greeley: Print, Politics, and the Failure of American Nationhood.
607c7bd5be786995e54e354a4ec3705b
https://www.smithsonianmag.com/history/how-i-learned-about-cult-lost-cause-180968426/
How I Learned About the “Cult of the Lost Cause”
How I Learned About the “Cult of the Lost Cause” The Cult of the Lost Cause. There it was in black and white, in a 1999 application to put the equestrian statue of General Pierre Gustave Toutant Beauregard on the National Register of Historic Places. In 2015, after a year of closely guarded discussions about Confederate monuments in New Orleans, most particularly Robert E. Lee, I asked a few members of my staff to go down to the main branch of the public library to get relevant research documents from the city archives. I wanted to know how and why these statues were erected and if there were any legal protections that would prevent us from moving them. It turns out that among news clippings, drawings and maps, they turned up applications to place the statues on the National Register of Historic Places. Preservationists and city and state officials petitioned the United States Department of the Interior, through the National Park Service, for three statues in Louisiana. As part of that application, extensive research was completed to make the historical case for acceptance. Included in the application was an acknowledgment that the reason for the statues’ very existence was the “Cult of the Lost Cause.” I had some limited knowledge of the “Lost Cause,” but the word “cult” hit my ear in a different way. The narrative for the National Register of Historic Places application read: The Cult of the Lost Cause had its roots in the Southern search for justification and the need to find a substitute for victory in the Civil War. In attempting to deal with defeat, Southerners created an image of the war as a great heroic epic. A major theme of the Cult of the Lost Cause was the clash of two civilizations, one inferior to the other. The North, “invigorated by constant struggle with nature, had become materialistic, grasping for wealth and power.” The South had a “more generous climate” which had led to a finer society based upon “veracity and honor in man, chastity and fidelity in women.” Like tragic heroes, Southerners had waged a noble but doomed struggle to preserve their superior civilization. There was an element of chivalry in the way the South had fought, achieving noteworthy victories against staggering odds. This was the “Lost Cause” as the late nineteenth century saw it, and a whole generation of Southerners set about glorifying and celebrating it. The more I read, the more I learned that these statues were indeed propaganda put up years, and often decades, after the Union was preserved. During Reconstruction and the 1960s Civil Rights era, there were specific attempts to erect statues like those of Robert E. Lee or Beauregard not only across the South, but indeed, across the country. Well into our journey, the Southern Poverty Law Center put out research showing there were some 700 Confederate memorial monuments and statues erected long after the Civil War. According to their research, “two distinct periods saw a significant rise in the dedication of monuments and other symbols,” the first around 1900 through the 1920s and the second in the 1950s and 60s. They coincided with the 50th and 100th anniversaries of the Civil War as well as attempted advancements by African-Americans. 
Some 20-plus years ago when these applications were written, officials understood the tremendous power of the Lost Cause. So why wasn’t this history better known? To the Lost Cause, rewriting the narrative of the war was as important as erecting monuments, and it largely worked. Still to this day, many I know in Louisiana believe the Civil War was more about states’ rights than preserving slavery. Even leaders at the highest levels of our national government try to dispute the cause of the Civil War. To educate myself and make sure I had an accurate understanding of history before taking any action with the monuments, I reached out to some of the leading experts. I called Ken Burns, the great documentarian, who produced the compelling nine-part PBS docuseries on the Civil War in the ’90s that re-aired more recently. I talked to local historians who were part of New Orleans’ 300th anniversary commission. I reached out to American and Civil War historians at Harvard University, the University of Virginia, the United States Military Academy at West Point, Tulane University, Louisiana State University, Rice University and more. All confirmed my reading. After we took the statues down, I began reading the most definitive and expansive work on the Lost Cause and the movement to whitewash history—books such as Lies Across America: What Our Historic Sites Get Wrong and Teaching What Really Happened, by James W. Loewen, a retired University of Vermont sociology professor. Loewen wrote that “the Confederates won with the pen (and the noose) what they could not win on the battlefield: the cause of white supremacy and the dominant understanding of what the war was all about.” The propaganda the Lost Cause adherents were peddling was not merely a benign myth; it was a lie that distorted history, sought to rationalize lynching, and created a second class of citizenship for African-Americans. With every new piece of history, it became clearer that the symbols were intended to send a specific message to African-Americans. I firmly believe that they had a link to the systems and institutions that we are working to address today. Most importantly, these particular statues do not represent history—they are an affront to it. I knew this sanitizing of history must end, and I did what I could, which was work with our City Council to remove them. We all have to keep pushing. To do that will require us to stretch our minds, to go to places that we intellectually haven’t before. In addition to writings by Loewen, the works of Charles Blow, Michelle Alexander, Dr. Cornel West, Michael Eric Dyson, Orlando Patterson, Bryan Stevenson, and Ta-Nehisi Coates have broadened my view. I remain in awe of the award-winning work of Jesmyn Ward. The writings of friends and mentors Marian Wright Edelman and Henry Louis Gates have inspired me to keep pushing. To chart a better path forward, we must have honest, truthful conversations about our shared history, how it shapes our world today, and what we all have to do to make the world a fairer, more just society. Only then will we truly win the war against the Cult of the Lost Cause.
eb6914663f21594997c2c9248dcb28cb
https://www.smithsonianmag.com/history/how-indigenous-australians-are-still-fighting-their-lands-25-years-after-landmark-court-case-180963893/
How Indigenous Australians Are Still Fighting for Their Lands 25 Years After a Landmark Court Case
How Indigenous Australians Are Still Fighting for Their Lands 25 Years After a Landmark Court Case Eddie Koiki Mabo couldn’t believe his ears. It was 1982, and two professors at James Cook University in Townsville, Australia, where Mabo worked as a gardener, had just told him he had no right to his native land. Though he’d lived on the mainland for years, his deep connection to Mer Island, one of the Torres Strait Islands off Australia’s northeast coast, never waned. But as Mabo talked about his home, professors Henry Reynolds and Noel Loos realized that Mabo thought Mer still belonged to him and his native community. No, they haltingly told him—under Australian law, it’s government land. When Captain Cook planted a British flag on the continent’s east coast in 1770, he claimed the lands as if no one was there. The entire country was declared terra nullius: “belonging to no one.” Mabo was shocked. Thousands of years living on these lands and indigenous people have no rights to them? He joined with four other plaintiffs to challenge the terra nullius doctrine in court. After a ten-year battle, on June 3, 1992, the High Court of Australia recognized what had always been obvious to the First Australians: They were there first, and they have the right to reclaim the lands they had occupied for 50,000 years. Those rights were cemented in the Native Title Act the following year. The landmark decision—issued 25 years ago this month—changed the lives of Australia’s Aboriginal and Torres Strait Island people. (While both are indigenous to Australia, they have different ancestry.) For cultures so deeply intertwined with the land and sea, reclaiming traditional turf—including hunting areas, rock art sites, fishing grounds and ceremonial lands—meant becoming whole again. “Having that recognition is very dear to my heart,” says Benton Creed of the Wulgurukaba indigenous group, who recently registered a native title claim for lands near Townsville, Queensland on behalf of his family and community. “We can make sure the land is looked after.” That concept of stewardship is central in Torres Strait and Aboriginal law, says Torres Strait Islander hip-hop artist and activist Mau Power. “We are custodians and caretakers of the land. We don’t own the land, the land owns us.” In the years since the decision, more than 300 claims have been granted across Australia, comprising some 927,000 square miles — 25 percent of the continent. They range from the massive 39,000-square-mile Wajarri Yamatji claim in remote Western Australia (about the size of Kentucky) to the Kaurareg people’s claim on a group of small islands in the Torres Strait that include the spot where Captain Cook claimed Australia for the Crown in 1770. When native title claims overlap cities or other developed areas, a compromise is often struck to maintain existing uses of certain lands. (These lands aren’t reservations—unlike Australian “missions” where some indigenous Australians were forced to live, the claims apply to those lands traditionally occupied by first Australians.) “When we look across this great land, we know that we hold at least 40 percent of this continent, and we hold the beauty of this country,” Aboriginal and Torres Strait Islander social justice commissioner June Oscar, of the Bunuba people, told a crowd at the recent National Native Title Conference in Townsville. 
“And we hold the aspirations for our future.” Mabo never enjoyed the rights his case secured; he died of cancer five months before the High Court handed down his victory. His daughter, Gail Mabo, delivered an emotional tribute to her father at the gathering. “Mabo is the strength of what native title is, and you can never forget what my father did, because it’s not just what my father did but how he did it — how he rallied all those people and brought them together as one.” Today, a quarter-century after the Mabo decision, almost every public event, from academic talks to concerts to political protests, begins with a “Welcome to Country”—an Aboriginal hospitality ritual that invites guests in and pays respects to traditional owners of the land through the ages. (When delivered by a non-indigenous Australian, it’s called an “Acknowledgement of Country.”) “It is a living culture, and just reminding people of that history and culture is part of that acknowledgement of country,” says Justin Mohamed, chief executive officer of the nonprofit group Reconciliation Australia. While it’s not required by law, it’s become increasingly common throughout Australia over the years, he adds. Yet laying claim to that country has proven far more fraught than anyone expected. “The whole process is very draining,” says Creed. Applicants have to provide detailed documentation proving their historic connection to, or occupation of, the lands they’re claiming to the courts. That means hiring archaeologists and lawyers to track down historical records and verify claims. For the “Stolen Generations”—those taken from their families and homelands as children to be “acclimated” into Australian society—the documentation requirements effectively shut them out of the very homelands they were taken from. “The native title process requires us to prove our ongoing connection to the land, despite the forcible removal of generations of children,” says Mick Dodson, a central figure in the long struggle for indigenous rights, at the conference. “This causes a unique form of trauma and pain.” And while native title rights are enshrined in Australian law, they’re not always upheld. A court decision in the early 2000s held that the rights of ranchers and farmers leasing lands in the state of Western Australia prevailed over the native title rights of the Miriuwung and Gajerrong peoples. The court agreed with the plaintiffs that certain "existing interests," like grazing, can “extinguish” native title claims. Indigenous groups with strong ties to the sea have had special difficulty in securing and defending their customary rights. While the Native Title Act was later amended to specifically confer sea rights, those claims can put indigenous groups at odds with the commercial fishing industry. “The struggle for sea country has been just as hard as the original battle,” acknowledged Nigel Scullion, Australia’s minister of indigenous affairs, during a speech at the conference. “The artificial distinction between land and saltwater country should not exist.” The Commonwealth government, he announced at the meeting, will dedicate $20 million to help detangle those rights and support indigenous fishing businesses and other economic opportunities. But it will take more than funding to fully right the wrongs of the past, Dodson says. “The human suffering of the indigenous people in this country cannot be assuaged by the opening of the purse,” he told a crowded auditorium. 
“It can only be assuaged by the opening of their hearts.” That's what many had in mind at a different First Nations conference near Uluru. There, indigenous groups and officials came together to propose a number of reforms, including enshrining Aboriginal and Torres Strait Islander rights in the Australian Constitution and establishing an indigenous advisory group to weigh in on government decisions. The groups issued a "statement of the heart" that calls for "a fair and truthful relationship with the people of Australia and a better future for our children based on justice and self-determination.” “It was probably one of the most empowering meetings I’ve been involved in in my 26 years working in Aboriginal affairs,” Mohamed says. “We’ve got some strong agreement and support. I walked away really inspired.” Power, for his part, is betting on Australia's youth. He sees signs that over the next 25 years, the next generation will make sure the promise of Mabo’s unlikely victory will be realized. “Just going around traveling, I’ve seen that the young children are more engaged, and even people of all walks and cultures are expressing interest,” said Power after his performance at the Mabo Day Festival on the anniversary of the High Court’s decision. Indigenous youth leaders carrying the Mabo torch are finding encouragement in high places. In late May, during Australia's Reconciliation Week, 50 Aboriginal and Torres Strait Islander youth leaders, Indigenous Youth Parliamentarians, spent a week in Canberra, the Australian capital, getting schooled in the ways of politics. "Our future is bright and I can see how we can quickly grow from five indigenous members of our parliament to many more, given the talent, the passion and the energy of the people here today," Australian Prime Minister Malcolm Turnbull told them. "We look forward to one day soon to the first Aboriginal or Torres Strait Islander Prime Minister. What a great moment that would be." Since Mabo’s victory, eight indigenous people have served in Parliament—up from just two in the years leading up to the landmark case. On June 3, the anniversary of the Mabo decision, Power released a tribute to Eddie Mabo. “Koiki” — Power’s reimagining of a tune co-written by Gail Mabo several years ago — tells the story of Mabo’s journey from local activist to national hero and his enduring legacy. As the deep sea tones of the Bu shell fade, he raps: His story was one about birthright / History will remember this great fight
4c9c0f9215d09be48a83e04d3d0683d4
https://www.smithsonianmag.com/history/how-much-do-we-really-know-about-pocahontas-4206184/
How Much Do We Really Know About Pocahontas
How Much Do We Really Know About Pocahontas Pocahontas is the most myth-encrusted figure in early America, a romantic “princess” who saves John Smith and the struggling Jamestown colony. But this fairy tale, familiar to millions today from storybook and film, bears little resemblance to the extraordinary young woman who crossed cultures and oceans in her brief and ultimately tragic life. The startling artwork (above), the oldest in the National Portrait Gallery collection, is the only image of Pocahontas taken from life. Made during her visit to London in 1616, the engraving depicts a stylish lady in beaver hat and embroidered velvet mantle, clutching an ostrich feather fan. Only her high cheekbones and almond-shaped eyes hint at her origins far from London. The inscription is also striking; it identifies her not as Pocahontas, but as “Matoaka” and “Rebecca.” In short, there seems little to link this peculiar figure, peering from above a starched white ruff, with the buck-skinned Indian maiden of American lore. So which image is closer to the woman we know as Pocahontas? She was born Matoaka, in the mid-1590s, the daughter of Powhatan, who ruled a native empire in what is now eastern Virginia. Powhatan had dozens of children, and power in his culture passed between males. But she did attract special notice for her beauty and liveliness; hence Pocahontas, a nickname meaning, roughly, “playful one.” This was also the name she was known by to the English who settled near her home in 1607. John Smith, an early leader in Jamestown, described her as beautiful in “feature, countenance, and proportion” and filled with “wit and spirit.” But contrary to her depiction in films by Disney and others, Pocahontas wasn’t a busty teenager when the English encountered her. Smith called her “A child of ten years old,” while another colonist described her as a “young girle,” cartwheeling naked through Jamestown. There is no evidence of romance between her and Smith (a lifelong bachelor, who, to judge from his own portrait, was far from handsome). Nor is there a firm basis for the tale of Pocahontas saving the English captain from execution by flinging her body across his. The only source for this story is Smith, who exaggerated many of his exploits and didn’t mention his rescue by Pocahontas until 17 years after it allegedly occurred. She did, however, help save Jamestown from starvation and Indian attack. She brought the colonists food, acted as an intermediary and warned the English of an impending ambush by her father. Smith lauded Pocahontas for this aid and gave her trinkets, but a few years later, the English kidnapped her and demanded a ransom of corn and captives held by Powhatan. When Powhatan failed to satisfy the English, his now-teenaged daughter stayed with the colonists. Whether she did so by choice isn’t clear, since all that’s known of her words and thoughts come from accounts by the English. One of them was John Rolfe, a widowed settler and pioneer planter of a new strain of tobacco. He was besotted by Pocahontas and wrote that she showed a “great appearance of love to me.” In 1614 she was baptized Rebecca (after the biblical bride who carried “two nations...in thy womb”) and wed Rolfe, with both natives and colonists present. Jamestown flourished thanks to Rolfe’s tobacco, and his marriage brought a short-lived peace to Virginia. It also provided an opportunity for the colony’s stockholders to tout their success in planting a cash crop and “civilizing” heathen natives. 
And so, in 1616, the Rolfes and their infant son sailed for London on a marketing trip sponsored by the Virginia Company. Pocahontas attended balls and plays, impressing the English with her manners and appearance, and sat for her portrait bedecked in courtly regalia. The copper-plate engraving, by the Dutch artist Simon van de Passe, was published in a volume devoted to English royalty. The inscription beneath her image makes clear the portrait’s message: Matoaka, daughter of an Indian “Emperour,” had been “converted and baptized,” becoming Rebecca Rolfe, a respectable, thriving and thoroughly Anglicized lady. But look closely at the portrait. Pocahontas appears grave, her cheeks are sunken and her hand is skeletal. Perhaps this was simply the artist’s rendering. But it may have reflected her failing health. In common with so many natives exposed to Europeans in this period, she and her young son fell ill in England, possibly from tuberculosis. Soon after the Rolfes set sail for Virginia, Pocahontas had to be brought ashore at the Thames port of Gravesend. She died there in March 1617, at the age of about 21. Rolfe, who “much lamented” her death, returned to Virginia and later married an Englishwoman. His son by Pocahontas, Thomas Rolfe, inherited his father’s plantation, married a colonist and joined the militia, which vanquished his mother’s people when they rose up a last time in rebellion. Most of this sad history was lost in the romantic mist that enveloped Pocahontas in later centuries. Her burial site in a Gravesend churchyard has also vanished. All that remains is her enigmatic life portrait, a Mona Lisa without a smile, whose thoughts we can only imagine. “I would give a thousand pelts,” Neil Young wailed in his ballad “Pocahontas,” to “find out how she felt.” Tony Horwitz, Smithsonian’s history columnist, was a Pulitzer Prize-winning journalist who worked as a foreign correspondent for the Wall Street Journal and wrote for the New Yorker; he won the Pulitzer for his reporting on the harsh conditions faced by low-wage U.S. workers. He was the author of seven books, including Baghdad without a Map, Midnight Rising and the digital best seller BOOM. His most recent work, Spying on the South, was released in May 2019. Horwitz died in May 2019 at the age of 60.
c2d7b552023a1d56ffc4bf6e1e34f103
https://www.smithsonianmag.com/history/how-oil-spill-50-years-ago-inspired-first-earth-day-180972007/
How an Oil Spill 50 Years Ago Inspired the First Earth Day
How an Oil Spill 50 Years Ago Inspired the First Earth Day Forty-nine years ago, on April 22, 1970, University of Southern California students affixed a gas mask to a statue of their mascot, Tommy Trojan, and buried an engine to symbolize the fight against pollution. In Colorado, a throng of bikers swarmed the state capitol. Volunteers picked up five tons of trash in West Virginia. All across the United States, teach-ins and demonstrations for the inaugural Earth Day would go down in history as a galvanizing moment for the environmental movement. But Earth Day’s roots lie in an earlier tragedy: a gargantuan oil spill that sullied the Santa Barbara coastline and put a national spotlight on pollution. Fifteen months before the first Earth Day, on January 28, 1969, oil started pooling in a black, tarry slick above the sea, six miles from the postcard-perfect shores of Southern California. The community, despite its concern about permitting drilling in federal waters, hadn’t been able to weigh in on the rig known as Platform A. Union Oil persuaded the government to issue a waiver for its fifth well—other areas required protective steel casing to extend at least 300 feet below the ocean floor, but Union Oil got permission to install only 239 feet of casing for the new well. The shortcut proved costly. The pressure prompted a blowout on the fourteenth day of drilling, jetting mud 90 feet above the platform’s floor. The company tried to staunch the oil flow from the well, but soon, oilmen noticed the sea bubbling. The buildup of pressure caused natural gas and oil to find and spew through fissures in the ocean floor. For the first 11 days of the spill, oil escaped at a rate of almost 9,000 gallons an hour. By the time Union Oil managed to stop the leakage, roughly three million gallons (4.5 Olympic swimming pools’ worth of oil) had spread over 35 miles. It ranked as the worst oil spill in the country’s history. (Fifty years later, after even more disastrous oil spills, it’s now the third-largest.) Paul Relis, then a student at the University of California, Santa Barbara (UCSB), finagled his way onto a flight over the spill. He recounted the scene in an oral history compiled by Pacific Standard: “I remember looking straight down into this huge upwelling of black out of the ocean. And I just instantly thought, this is going to change the world.” The disaster prompted Relis to help found an ecology center, one of the earliest such environmental information hubs in the nation. The spill jolted other residents into action, too. Within the first week, local activists created a grassroots group called Get Oil Out! (GOO!) that clamored for the government to stop drilling in the Santa Barbara Channel. Union Oil enlisted crop-dusting planes to coat the growing slick with dispersant and talc, and the company sent divers to the ocean floor to try to cement the cracks, but these efforts didn’t stop oil from washing onto the beaches in eerily silent waves, coating the feathers of dead loons and Western grebes. Despite attempts to clean and care for the oil-weighted birds, between 3,700 (the official count) and 9,000 (scientists’ estimate) died. As citizens rallied and the oil company rushed to spread 3,000 tons of straw on the beaches to sop up the crude oil, the scene gained a national spotlight. 
Teresa Sabol Spezio, author of Slick Policy: Environmental and Science Policy in the Aftermath of the Santa Barbara Oil Spill, calls it “the first Technicolor disaster.” President Nixon, recently inaugurated and the owner of a California beachfront property himself, even visited the beach to take in the damage. “The Santa Barbara incident,” he said, “has frankly touched the conscience of the American people.” Other politicians visited the site of the spill as well, including Gaylord Nelson, a Wisconsin senator whose environmental bona fides outpaced the president’s. After a speech at a water quality conference in Santa Barbara that summer, Nelson viewed the damage wreaked by the spill. Afterward, on board a plane to his next speaking gig at Berkeley, the senator read about teach-ins against the Vietnam War. “It suddenly dawned on me,” he later recalled, “why not a nationwide teach-in on the environment?” The idea of Earth Day took root. Writing about the oil spill in January 1970, The New York Times’ environment correspondent Gladwin Hill called it the “ecological ‘shot heard round the world,’” though concern about the environment had been growing before 1969. Americans were starting to question the pre-World War II consensus that pollution was simply an unattractive trade-off for a robust, industrial economy, says environmental historian Adam Rome. This shifting attitude, he explains, stemmed in part from the post-war affluence of the middle class and scientists’ increasing willingness to discuss environmental consequences with the public. People had also begun to notice a troubling pattern, Rome says. New technologies incurred alarming consequences, like cancer linked to nuclear fallout or the herbicide scare that kept cranberries off the Thanksgiving table in 1959. Rachel Carson’s Silent Spring became a best seller in 1962, the 1968 Earthrise photo taken during Apollo 8 revealed the fragility of the planet, Lyndon B. Johnson signed nearly 300 environment-related bills during his time in office and the Sierra Club’s membership doubled from 1960 to 1965, according to a paper Rome published in the Journal of American History. The environmental movement existed before the Santa Barbara spill, but it was still fragmented and without the name we now know it by. The 1969 oil spill was a catalyst that helped change the status quo. “I think [the oil spill] was one of the ultimately most important in a series of accidents or problems that made people realize that a lot of the modern technologies that seemed miraculous … posed unprecedented risks to the health of the environment and ultimately to ourselves,” Rome says. If Santa Barbara caught the attention of the country, Earth Day riveted it. According to his biography, The Man from Clear Lake, after the idea of Earth Day struck him, Nelson founded a non-profit called Environmental Teach-In Inc., coaxed California Republican Pete McCloskey to co-chair the day of learning (it wasn’t dubbed “Earth Day” until a later ad campaign) and announced the event just a month after visiting Santa Barbara. “I am convinced that the same concern the youth of this nation took in changing this nation’s priorities on the war in Vietnam and on civil rights can be shown for the problems of the environment,” he told a crowd in Seattle. Earth Day’s focus on youth involvement was evident in the date, selected to avoid finals and spring break, and the hiring of Denis Hayes, a 25-year-old Stanford graduate, to organize the event. 
That fall, writes Rome, “the number of student environmental organizations exploded.” As momentum for Earth Day gathered, the aftereffects of the Santa Barbara oil spill made themselves felt in local and national policy. While Get Oil Out!’s efforts to ban drilling in the Santa Barbara channel’s federal waters proved unsuccessful in the long term, the furor over the oil-slicked sea led to the creation of one of the first environmental studies departments in the country at UCSB, a template that would be adopted nationwide. Green-minded lawmakers, like Henry “Scoop” Jackson and Edmund Muskie, used the catastrophe to finally move stymied conservation policies, like the Clean Water Act, forward in Congress. The oil spill gave the bills urgency, because politicians and constituents alike felt that “if [pollution] can happen in Santa Barbara,” a wealthy, upper crust community, “it can really happen anywhere,” Spezio says. Seeing a conservative-leaning area unite against pollution also broadened the environmental movement, enticing more radical, left-leaning thinkers who hoped that “environmental issues could be a wedge that would lead people to a broader critique of American society,” Rome says. When April came, the rallying cry of Earth Day solidified a rag-tag coalition of liberal Democrats, middle-class women, youth activists, conservationists and scientists, Rome explains in his book The Genius of Earth Day. The day of action, which inspired teach-ins at more than 1,500 college campuses, also had practical importance. “Working on Earth Day as an organizer was an incredible education,” Rome says, providing the young planners and speakers with hands-on experience, a network and a deepened investment in the cause. Participants expressed apprehension about sky-darkening air pollution, toxic waste, the Cuyahoga River burning and suburban sprawl overtaking the wilderness. They discussed “survival” long before global warming became a buzzword. Earth Day helped launch, and name, the environmental movement. Such a prominent national display of environmental activism applied political pressure in Washington as well. By the end of 1970, Nixon had formed the Environmental Protection Agency, which would coordinate responses to future contamination disasters. By calling attention to close-to-home environmental issues in communities across the country, Earth Day rallied constituents and gave politicians reason to approve the agency. The National Environmental Policy Act provided communities like Santa Barbara with the chance to offer public comment about federal land use decisions. The Clean Water Act passed in 1972. And by the end of the 1960s, environmental coverage in the media had quadrupled from a decade before. To mark the Santa Barbara oil spill’s one-year anniversary in January 1970, 500 demonstrators blockaded a pier along the beach. Some of the protestors stayed put for 17 hours, until police with tear gas threatened to move them. Denis Hayes, the 25-year-old Earth Day organizer, spoke at the occasion. Eighty-four days before the first Earth Day, the Santa Barbara crowd zealously rallied to the environmental cause. Hayes told Pacific Standard: “It was probably the first really giant crowd I had seen that felt passionately, I mean really passionately, about environmental issues.” Lila Thulin is the digital editorial assistant for Smithsonian magazine.
12e4abf06cf5da9041e15034dd2d37f1
https://www.smithsonianmag.com/history/how-pioneering-woman-health-officer-saved-portland-plague-180974499/
The Pioneering Health Officer Who Saved Portland From the Plague
The Pioneering Health Officer Who Saved Portland From the Plague Esther Pohl was a familiar sight around Portland, Oregon, by the summer of 1907. Thirty-five years old, with wavy hair piled atop her head, she was known for bicycling from house to house visiting the patients of her private obstetrics practice. One of the first women in Oregon to practice medicine, she had also served on the city health board since 1905. But on July 11, 1907, she added a new feather to her cap when the health board unanimously elected her Portland’s health commissioner. That made her the first woman to serve as health officer in a major American city. Pohl began her term battling common infectious diseases of the early 20th century—maladies like smallpox, whooping cough, and tuberculosis, which she called “the greatest evil of this day.” The Oregon Journal called her “one of the best known woman physicians on the coast” as well as “one of the busiest women in the community.” But before the summer of 1907 was through, she’d confront an even more formidable foe: the bubonic plague. Armed with the latest scientific knowledge and determined not to repeat the mistakes of other cities on the Pacific, Pohl marshalled a response that focused on the real enemy driving the plague’s spread: rats—and their fleas. Most famous as a medieval scourge that killed millions across Asia, Europe and Africa in the mid-14th century, the bubonic plague was never fully eradicated from the globe (in fact, it’s still around). The 1907 outbreak that threatened Portland—a city that would grow to over 200,000 people by 1910, making it the fourth-largest metropolis on the West Coast—can be traced back to a wave that began in China in the 19th century and then spread along shipping routes. The disease first made landfall on U.S. territory in Hawaii as the century turned. In Honolulu, several Chinese immigrants died of the plague in 1899. Reaction from local officials was swift: All 10,000 residents of the city’s Chinatown were placed under quarantine in an eight-block area surrounded by armed guards. When the disease spread to a white teenager outside the quarantine zone, officials began burning buildings in a desperate attempt to quell the disease. The next January, a stray spark ignited an 18-day blaze that burned down the city’s entire Chinatown. The devastation was brutal, but it also stopped the plague—at least in Honolulu. In March 1900, the proprietor of a lumber yard named Chick Gin died in a flophouse basement in San Francisco’s Chinatown. Health examiners called to his emaciated body immediately suspected the plague after noticing that his corpse showed swelling in the groin area—a tell-tale sign of the disease (“bubonic” comes from the Greek for groin, boubon). The authorities didn’t even wait until the results were back from the lab to impose a quarantine on Chinatown, trapping about 25,000 people in a 15-block area surrounded by rope. No food was allowed in, and no humans let out. Well-off white San Franciscans were enraged at the disturbance in their daily lives, since much of the city depended on Chinese workers to cook and clean. Yet many comforted themselves with the idea that they weren’t likely to contract the disease themselves. At the time, the plague was often racialized, as though something in the bodies of immigrant communities—particularly Asian communities—made them more susceptible. 
It was thought that the plague could only thrive in warm locales, and among those who ate rice instead of meat, since their bodies supposedly lacked sufficient protein to fend off the disease. City and state officials did their best to stage a cover-up in San Francisco, denying the plague’s presence. As historian of medicine Tilli Tansey writes for Nature, “California governor Henry Gage—mindful of his state’s annual $25-million fruit harvest and concerned other states would suspect a problem—disparaged ‘the plague fake’ in a letter to U.S. secretary of state John Hay and issued threats to anyone publishing on it.” It took an independent scientific inquiry and finally a concerted disinfection campaign before San Francisco was considered safe again in 1904. Meanwhile, 122 people had died. But the plague wasn’t truly gone from San Francisco—far from it. On May 27, 1907, the city recorded another plague death. This time, however, two key things were different. For one, experts finally had a handle on how the disease was spread: in the guts of fleas carried on rats and other rodents. Although the bacteria that causes the bubonic plague, Yersinia pestis, had been identified back in 1894, at that point scientists were still unclear on how it was spread. At the turn of the century, many believed the bubonic plague was airborne and easily spread from human to human. (Pneumonic plague is spread by droplets, but it’s less common than the bubonic form.) Scientists had long noted that mass die-offs among rats coincided with outbreaks of the plague among humans, but the transmission route wasn’t clear. In 1898, Paul-Louis Simond, a French researcher sent by the Pasteur Institute to the South Asian city of Karachi, demonstrated that infected rat fleas could transmit the plague bacteria, but it took several years and confirmation from other researchers before the idea was well-accepted. “For most of human history, no city had a chance against plague, because they thought its cause was miasma, or sin, or foreigners,” writes Merilee Karr, who covered Pohl’s efforts against the plague for Portland Monthly. “The realization dawned that rats were involved sometime in the eighteenth or nineteenth century. Acting on partial knowledge was dangerous, because just killing rats would have sent fleas hopping off dead rats to look for new hosts.” Another thing that was different by 1907: Because public officials now understood how the disease was spread, they were willing to work together to prevent its transmission. The plague was no longer considered a problem that could be confined to a single location: As a port on the Pacific, Portland was vulnerable to the same flea-infested rats scurrying through the harbor and alleys of San Francisco, not to mention Honolulu or Hong Kong. Although San Francisco lagged once again in mounting an effective response, by August 1907, U.S. public health officials were urging anti-plague measures up and down the West Coast, including an order for all vessels in the region to be fumigated and all rats in the ports exterminated. Esther Pohl went even further. She designed an anti-plague strategy that combined her scientific and technical expertise with an understanding of the power of the press. One of her first big moves, according to Kimberly Jensen—author of Oregon's Doctor to the World: Esther Pohl Lovejoy and a Life in Activism—was to invite reporters and photographers along on her inspection of the waterfront. 
On September 1, 1907, the Oregon Journal published a Sunday exposé headlined “Menace to City’s Health,” describing a horrified Pohl discovering piles of rotting garbage, raw sewage, and a host of “unlovely smells” along the docks. One particular eyesore at the foot of Jefferson Street was used “as a dumping ground and boneyard for all the dilapidated push carts and peddler’s wagons confiscated by police. For half a block there is a wild tangle of milk carts … old rusty iron stoves … worn-out wire cables and rotten wood piles.” The acres of jumbled, broken trash were a perfect breeding ground for rats, not to mention other health issues. A few days later, Pohl reported on the “indescribably filthy” conditions she found to the city’s board of health, calling for property owners—and the city—to be compelled to clean up their messes. The board was supportive, and on September 11, she made a presentation to the city council. She reminded leaders of a spinal meningitis outbreak just a few months earlier and warned, “Now we are threatened with a much more dreadful disease.” The measures she recommended were multi-pronged: Garbage had to be properly covered; food had to be protected; and rat catchers had to be hired. Pohl asked for $1,000 to fund the work, with the possibility that more would be needed. The city council approved her request—and let her know that if she needed it, they’d give her five times that amount of money. “She was a compelling speaker,” says Jensen. “Pohl and women's groups used the media effectively by contacting journalists and photographers to document conditions on the waterfront and other areas to raise public awareness and calls for city action. And business owners were particularly concerned about their bottom line and so the council, aligned with business, voted [for] the money.” Pohl also resisted calls to racialize the plague, even while other local medical experts persisted in drawing a connection between ethnicity and the disease. In December 1907, Oregon state bacteriologist Ralph Matson told the Journal, “If we cannot compel the Hindu, Chinamen and others to live up to our ideals of cleanliness, and if they persist in congregating in hovels and hoarding together like animals ... the strictest kind of exclusion would not be too severe a remedy.” The paper played up his quotes, describing West Coast Chinatowns as “filled with dirt and offal, unsanitary, honeycombed with dark cellars and dark passageways.” But Pohl never singled out Chinatown, or any other residential community. Portland’s Chinatown, which began to take root in the 1850s, was already under stress thanks to federal exclusion acts and racist violence, with numbers declining from a peak of about 10,000 people in 1900 to somewhere around 7,000 in 1910. Pohl avoided racist rhetoric and targeted the waterfront instead, urging every member of the city’s populace to be vigilant. In mid-September, Pohl met with Portland business leaders, emphasizing the importance of a clean and vermin-free waterfront. They agreed and formed a committee to go and compel business owners to clean up. C.W. Hodson, the president of the local commerce club, explained to the Journal, “There isn’t any plague here now and we are hoping that there isn’t going to be any—but there must be something done besides hoping.” According to the Journal, most of the merchants on the waterfront were willing to comply with the club’s orders, having already read about the dangerous conditions in the paper. 
By mid-September, Pohl also called in outside help: a rat catcher named Aaron Zaik, who had trained in the Black Sea port of Odessa and also worked in New York City and Seattle. The Oregonian emphasized his use of modern methods and chemicals, as well as his mastery over “the psychology and habits of the rodent tribe.” Pohl made him a special deputy on the health board, and was so pleased with his work that after a few weeks she offered his services for free to any property holder. By the end of October, Pohl added a new prong to the city’s rat crusade: a bounty. She offered Portlanders five cents per rat, brought dead or alive to the city crematory, and instructed them in careful handling so the fleas would be killed alongside the rats. Pohl emphasized that killing rats was a civic duty, telling the Oregonian that “everyone in the city, rich and poor, should consider it his duty to exterminate rats.” By December, Jensen writes, “the plague scare was essentially over, and Portland had had no reported cases of the disease.” The co-operation among business, the city council and Pohl was remarkable for a number of reasons, not least the fact that many of the orders had been handed down by a 35-year-old woman at a time when Oregon women didn’t even have the right to vote. And while multiple reasons factored in, Jensen says that Pohl’s work was key: “Her leadership and her skilled use of publicity made her a touchstone for many people to take action.” In the end, Portland was the only West Coast port city that didn’t have any plague cases in 1907. Karr says via e-mail, “There has still never been a case of bubonic plague within 100 miles of Portland.” She credits the city’s activated population, “Esther Pohl’s leadership, and Portland’s willingness to follow her to save their city and their own lives.” Bess Lovejoy is a writer and editor who lives in Brooklyn. She is the author of Rest in Pieces: The Curious Fates of Famous Corpses.
fc3bc8d08258459f842cc2a95717aeea
https://www.smithsonianmag.com/history/how-teenaged-jewish-boy-went-refugee-assassin-puppet-nazi-propaganda-180971204/
How a Jewish Teenager Went From Refugee to Assassin to Puppet of Nazi Propaganda
How a Jewish Teenager Went From Refugee to Assassin to Puppet of Nazi Propaganda At age 15, Herschel Grynszpan was just another Jewish refugee fleeing Nazi Germany for safe haven in pre-war France. Like the 50,000 others who crossed the border to outrun Adolf Hitler’s reach, Grynszpan received a cold reception in his new country. Anti-Semitism was on the rise; Jewish refugees lived in the poorest parts of cities or were prevented from entering the country at all. The Munich Pact of September 1938 meant France was going to great lengths to prevent war with Germany—and that meant appeasing the Führer. By age 17, however, Grynszpan was perhaps the most famous Jew in the world. After receiving word that his family had been forcibly removed from their home in Germany and deposited at the Polish border, Grynszpan sought revenge. The morning of November 7, 1938, Grynszpan bought a gun and went to the German Embassy in Paris. He didn’t have a clear target—he just wanted to make a point the world couldn’t ignore. When he was ushered into the office of a young diplomat named Ernst vom Rath, Grynszpan shot him without even knowing his name. He willingly submitted to arrest by the French authorities, and immediately made a statement about the treatment of Jews at the hands of the Nazis. But Grynszpan’s plan backfired horrifically. The murder was used by Hitler and his minister of propaganda Joseph Goebbels as justification for the violent riots of Kristallnacht, which started just two days later. According to the Nazis, Grynszpan’s actions proved the Jews of the world would stop at nothing to destroy Aryan Germans. The only reasonable action was to attack the Jews first. The assassination and the years-long game of cat and mouse that came next is the subject of a new book by writer Stephen Koch. Hitler’s Pawn: The Boy Assassin and the Holocaust follows Grynszpan from French prison to German concentration camp as the Nazi regime shuttled him from place to place in the hope of using him as a set-piece in their farcical trial against “world Jewry.” Grynszpan was initially held for 20 months without indictment in French prison, sharing his story with the world as a media darling. But once the war broke out in 1939, Grynszpan lost some of his appeal—at least to the Allied Forces. For the Nazis, he was still an enticing prisoner to be used in a propaganda trial against the Jews. And when France fell to Germany, Grynszpan was quickly handed over. Yet even as he was interrogated and sent from one prison to another, the young man managed to thwart the Nazis’ plan. Grynszpan reverted to a lie that transformed his political assassination into a crime of passion, fabricating a gay relationship with vom Rath in order to discredit the victim and the Nazis more generally. Even knowing their prisoner was lying, the Nazis feared the smear so much that the case never went to trial. Although Grynszpan succeeded in preventing a trial from going forward, all traces of him vanish after 1942. Whether he was murdered by the Nazis at that time or later has continually been debated. Historians have generally claimed the Nazis killed him before the end of the war, as no trace of him ever appeared afterwards. 
In 2016, archivists even claimed to have found a photo of Grynszpan from 1946, at least a year after he was supposed to have been dead, but even that isn’t definitive proof of what happened to him. To learn more about this little-known figure and his role in World War II, Smithsonian.com spoke with author Stephen Koch, who relied on research by European scholars to write his story. What parallels do you see between this story and the murder of the Archduke Franz Ferdinand in World War I? I did certainly think about the Sarajevo event. And 1938 was only 24 years after the event in Sarajevo. All of Europe, which was still shell-shocked from World War I, would have thought of it, too. The key difference between the two killings is that Ernst vom Rath was not a particularly important diplomat. He was not the archduke. Hitler wanted to get the impression out there that it was the ambassador that Herschel had come to. But it’s hard to overestimate the degree to which people in Europe feared the return of the slaughter of the First World War. First of all, the shooting was actually an isolated incident. It would now be absolutely forgotten if Goebbels and Hitler had not decided to use it as their pretext for Kristallnacht. By a grotesque irony, it accomplished what Herschel set out to do—alert the world to the criminality of the Hitler regime. Would Hitler have proceeded with Kristallnacht without someone to blame it on? Yes. One of the important things about Hitler’s expansionism and his more outrageous actions was that he always wanted some pretext and was prepared to have the Gestapo create a pretext when necessary so that he could say something was causing an intense reaction on the part of the German people. Why did Hitler think he needed to justify his regime’s actions to the world? Hitler wanted to look like a head of state, and heads of state were not supposed to set up riots that killed people. Hitler did his best to look like he was a mere bystander at Kristallnacht and it had all been Goebbels’ idea. His expansionism was always based on the idea that he had some kind of claim on the countries he invaded or took over. These pretexts were usually pretty ridiculous, but nonetheless, his propaganda machine would emphasize it tremendously to convince the German people. He wanted people to believe that he was, as the leader of a resurgent newly powerful Germany, asserting German rights in the world. Did the Nazis actually believe their own propaganda? That Herschel was a pawn of some Jewish conspiracy? It’s one of the most extraordinary paranoid fantasies maybe in modern history. But take the two functionaries who were in charge of organizing propaganda and policy around Herschel — Friedrich Grimm and Wolfgang Diewerge. Both of those men had previously come together in an incident in which, in Switzerland, an important Nazi had been assassinated by a Jewish fellow named David Frankfurter. They seized on this as an example of world Jewry trying to destroy German resurgence. But the fact is that these “Jewish crimes” on a large scale were entirely paranoia. Herschel Grynszpan was fully aware of what he was doing when he shot vom Rath. How did that affect him later on? Herschel was torn himself in a way that defined the rest of his life. On the one hand, he felt he had done something almost heroic, something wonderful, something that had helped wake up the world to evil. On the other hand, he was horrified by Kristallnacht and that he was used as the pretext. 
He fasted and prayed every Monday for the rest of his life in penance for having been used in this way and also for having murdered an innocent man. How did the world react to his crime? He was used by Goebbels and German propagandists as part of a huge anti-Semitic fantasy and he was used by anti-Nazis like [American journalist] Dorothy Thompson and many others as an example of a tragic child who had been driven to a rash action by Hitler’s crimes. Thompson said, “I want higher justice for this boy.” Isn’t it possible to understand why this child did something that was politically foolish and maybe even immoral, but why he did it after the persecution that his family had been through? Those divided feelings had an impact on the trial, too. Can you talk about why it was delayed in France? Georges Bonnet, as foreign minister, had a fear [of the outcome]. Let’s say Herschel is tried and acquitted. Hitler would be enraged. Let’s say Herschel was tried and sent to the guillotine. The world would be enraged. Bonnet didn’t see any way of handling it that was a winning hand. So he did everything he could to stop it. As time went on and Hitler’s reputation went steadily down [in France and elsewhere], it looked more and more like Herschel would be outright acquitted. France gave Grynszpan to the Gestapo after their country fell to the Nazis. How did Herschel’s attitude change between his imprisonment in France and when he was taken to Germany? [At first] Grynszpan wanted to make his case honestly—that his people were being persecuted and that he was protesting. Then, after he was captured by the Germans, he had to remove himself from history, make himself invisible again, which is what he set out to do [by lying about the relationship he had with vom Rath so the Nazis wouldn’t go forward with the trial]. That’s the heroic part of it that I find very touching. We don’t even know how he died, but we do know he died forgotten. No one cared about Herschel Grynszpan anymore. Is there one theory that you think is more probable for Grynszpan’s death? I tilt toward, without certainty, the idea that he survived late into the war. [Nazi war criminal] Adolf Eichmann’s testimony at his Jerusalem trial was that he met Grynszpan late in the war. That was not 1942, that was more like 1944. Another German official said he knew that the case was never dropped but periodically reviewed. The mystery is, why does everything in the German record stop, vanish, after the decision to put the kibosh on the trial in May of 1942? Eichmann said his associates interrogated Herschel and filed a report, but there is no report in the files. What do you hope readers get out of the book? First of all, a tragic story. A kid did something that he hoped would be right and heroic, and it turned against him. A kid who was used for evil purposes then found a way to defeat an evil purpose. Herschel Grynszpan is in the history books usually for maybe five lines, and that’s the end of it. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
044329c7a079b20581460421a21f226a
https://www.smithsonianmag.com/history/how-the-battle-of-little-bighorn-was-won-63880188/?no-ist
How the Battle of Little Bighorn Was Won
How the Battle of Little Bighorn Was Won Editor’s note: In 1874, an Army expedition led by Lt. Col. George Armstrong Custer found gold in the Black Hills, in present-day South Dakota. At the time, the United States recognized the hills as property of the Sioux Nation, under a treaty the two parties had signed six years before. The Grant administration tried to buy the hills, but the Sioux, considering them sacred ground, refused to sell; in 1876, federal troops were dispatched to force the Sioux onto reservations and pacify the Great Plains. That June, Custer attacked an encampment of Sioux, Cheyenne and Arapaho on the Little Bighorn River, in what is now Montana. The Battle of the Little Bighorn is one of the most studied actions in U.S. military history, and the immense literature on the subject is devoted primarily to answering questions about Custer’s generalship during the fighting. But neither he nor the 209 men in his immediate command survived the day, and an Indian counterattack would pin down seven companies of their fellow 7th Cavalrymen on a hilltop over four miles away. (Of about 400 soldiers on the hilltop, 53 were killed and 60 were wounded before the Indians ended their siege the next day.) The experience of Custer and his men can be reconstructed only by inference. This is not true of the Indian version of the battle. Long-neglected accounts given by more than 50 Indian participants or witnesses provide a means of tracking the fight from the first warning to the killing of the last of Custer’s troopers—a period of about two hours and 15 minutes. In his new book, The Killing of Crazy Horse, veteran reporter Thomas Powers draws on these accounts to present a comprehensive narrative account of the battle as the Indians experienced it. Crazy Horse’s stunning victory over Custer, which both angered and frightened the Army, led to the killing of the chief a year later. “My purpose in telling the story as I did,” Powers says, “was to let the Indians describe what happened, and to identify the moment when Custer’s men disintegrated as a fighting unit and their defeat became inevitable.” The sun was just cracking over the horizon that Sunday, June 25, 1876, as men and boys began taking the horses out to graze. First light was also the time for the women to poke up last night’s cooking fire. The Hunkpapa woman known as Good White Buffalo Woman said later she had often been in camps when war was in the air, but this day was not like that. “The Sioux that morning had no thought of fighting,” she said. “We expected no attack.” Those who saw the assembled encampment said they had never seen one larger. It had come together in March or April, even before the plains started to green up, according to the Oglala warrior He Dog. Indians arriving from distant reservations on the Missouri River had reported that soldiers were coming out to fight, so the various camps made a point of keeping close together. There were at least six, perhaps seven, cheek by jowl, with the Cheyennes at the northern, or downriver, end near the broad ford where Medicine Tail Coulee and Muskrat Creek emptied into the Little Bighorn River. Among the Sioux, the Hunkpapas were at the southern end. Between them along the river’s bends and loops were the Sans Arc, Brulé, Minneconjou, Santee and Oglala. Some said the Oglala were the biggest group, the Hunkpapa next, with perhaps 700 lodges between them. The other circles might have totaled 500 to 600 lodges. 
That would suggest as many as 6,000 to 7,000 people in all, a third of them men or boys of fighting age. Confusing the question of numbers was the constant arrival and departure of people from the reservations. Those travelers—plus hunters from the camps, women out gathering roots and herbs and seekers of lost horses—were part of an informal early-warning system. There were many late risers this morning because dances the previous night had ended only at first light. One very large tent near the center of the village—probably two lodges raised side by side—was filled with the elders, called chiefs by the whites but “short hairs,” “silent eaters” or “big bellies” by the Indians. As the morning turned hot and sultry, large numbers of adults and children went swimming in the river. The water would have been cold; Black Elk, the future Oglala holy man, then 12, would remember that the river was high with snowmelt from the mountains. It was approaching midafternoon when a report arrived that U.S. troops had been spotted approaching the camp. “We could hardly believe that soldiers were so near,” the Oglala elder Runs the Enemy said later. It made no sense to him or the other men in the big lodge. For one thing, whites never attacked in the middle of the day. For several moments more, Runs the Enemy recalled, “We sat there smoking.” Other reports followed. White Bull, a Minneconjou, was watching over horses near camp when scouts rode down from Ash Creek with news that soldiers had shot and killed an Indian boy at the fork of the creek two or three miles back. Women who had been digging turnips across the river some miles to the east “came riding in all out of breath and reported that soldiers were coming,” said the Oglala chief Thunder Bear. “The country, they said, looked as if filled with smoke, so much dust was there.” The soldiers had shot and killed one of the women. Fast Horn, an Oglala, came in to say he had been shot at by soldiers he saw near the high divide on the way over into the Rosebud valley. But the first warning to bring warriors on the run probably occurred at the Hunkpapa camp around 3 o’clock, when some horse raiders—Arikara (or Ree) Indians working for the soldiers, as it turned out—were seen making a dash for animals grazing in a ravine not far from the camp. Within moments shooting could be heard at the south end of camp. Peace quickly gave way to pandemonium—shouts and cries of women and children, men calling for horses or guns, boys sent to find mothers or sisters, swimmers rushing from the river, men trying to organize resistance, looking to their weapons, painting themselves or tying up their horses’ tails. As warriors rushed out to confront the horse thieves, people at the southernmost end of the Hunkpapa camp were shouting alarm at the sight of approaching soldiers, first glimpsed in a line on horseback a mile or two away. By 10 or 15 minutes past 3 o’clock, Indians had boiled out of the lodges to meet them. Now came the first shots heard back at the council lodge, convincing Runs the Enemy to put his pipe aside at last. “Bullets sounded like hail on tepees and tree tops,” said Little Soldier, a Hunkpapa warrior. The family of chief Gall—two wives and their three children—were shot to death near their lodge at the edge of the camp. But now the Indians were rushing out and shooting back, making show enough to check the attack. The whites dismounted. Every fourth man took the reins of three other horses and led them along with his own into the trees near the river. 
The other soldiers deployed in a skirmish line of perhaps 100 men. It was all happening very quickly. As the Indians came out to meet the skirmish line, straight ahead, the river was to their left, obscured by thick timber and undergrowth. To the right was open prairie rising away to the west, and beyond the end of the line, a force of mounted Indians rapidly accumulated. These warriors were swinging wide, swooping around the end of the line. Some of the Indians, He Dog and Brave Heart among them, rode out still farther, circling a small hill behind the soldiers. By then the soldiers had begun to bend back around to face the Indians behind them. In effect the line had halted; firing was heavy and rapid, but the Indians racing their ponies were hard to hit. Ever-growing numbers of men were rushing out to meet the soldiers while women and children fled. No more than 15 or 20 minutes into the fight the Indians were gaining control of the field; the soldiers were pulling back into the trees that lined the river. The pattern of the Battle of the Little Bighorn was already established—moments of intense fighting, rapid movement, close engagement with men falling dead or wounded, followed by sudden relative quiet as the two sides organized, took stock and prepared for the next clash. As the soldiers disappeared into the trees, Indians by ones and twos cautiously went in after them while others gathered nearby. Shooting fell away but never halted. Two large movements were unfolding simultaneously—most of the women and children were moving north down the river, leaving the Hunkpapa camp behind, while a growing stream of men passed them on the way to the fighting—“where the excitement was going on,” said Eagle Elk, a friend of Red Feather, Crazy Horse’s brother-in-law. Crazy Horse himself, already renowned among the Oglala for his battle prowess, was approaching the scene of the fighting at about the same time. Crazy Horse had been swimming in the river with his friend Yellow Nose when they heard shots. Moments later, horseless, he met Red Feather bridling his pony. “Take any horse,” said Red Feather as he prepared to dash off, but Crazy Horse waited for his own mount. Red Feather didn’t see him again until 10 or 15 minutes later, when the Indians had gathered in force near the woods where the soldiers had taken refuge. It was probably during those minutes that Crazy Horse had prepared himself for war. In the emergency of the moment many men grabbed their weapons and ran toward the shooting, but not all. War was too dangerous to treat casually; a man wanted to be properly dressed and painted before charging the enemy. Without his medicine and time for a prayer or song, he would be weak. A 17-year-old Oglala named Standing Bear reported that after the first warnings Crazy Horse had called on a wicasa wakan (medicine man) to invoke the spirits and then took so much time over his preparations “that many of his warriors became impatient.” Ten young men who had sworn to follow Crazy Horse “anywhere in battle” were standing nearby. He dusted himself and his companions with a fistful of dry earth gathered up from a hill left by a mole or gopher, a young Oglala named Spider would recall. Into his hair Crazy Horse wove some long stems of grass, according to Spider. 
Then he opened the medicine bag he carried about his neck, took from it a pinch of stuff “and burned it as a sacrifice upon a fire of buffalo chips which another warrior had prepared.” The wisp of smoke, he believed, carried his prayer to the heavens. (Others reported that Crazy Horse painted his face with hail spots and dusted his horse with the dry earth.) Now, according to Spider and Standing Bear, he was ready to fight. By the time Crazy Horse caught up with his cousin Kicking Bear and Red Feather, it was hard to see the soldiers in the woods, but there was a lot of shooting; bullets clattered through tree limbs and sent leaves fluttering to the ground. Several Indians had already been killed, and others were wounded. There was shouting and singing; some women who had stayed behind were calling out the high-pitched, ululating cry called the tremolo. Iron Hawk, a leading man of Crazy Horse’s band of Oglala, said his aunt was urging on the arriving warriors with a song: Brothers-in-law, now your friends have come. Take courage. Would you see me taken captive? At just this moment someone near the timber cried out, “Crazy Horse is coming!” From the Indians circling around behind the soldiers came the charge word—“Hokahey!” Many Indians near the woods said that Crazy Horse repeatedly raced his pony past the soldiers, drawing their fire—an act of daring sometimes called a brave run. Red Feather remembered that “some Indian shouted, ‘Give way; let the soldiers out. We can’t get at them in there.’ Soon the soldiers came out and tried to go to the river.” As they bolted out of the woods, Crazy Horse called to the men near him: “Here are some of the soldiers after us again. Do your best, and let us kill them all off today, that they may not trouble us anymore. All ready! Charge!” Crazy Horse and all the rest now raced their horses directly into the soldiers. “Right among them we rode,” said Thunder Bear, “shooting them down as in a buffalo drive.” Horses were shot and soldiers tumbled to the ground; a few managed to pull up behind friends, but on foot most were quickly killed. “All mixed up,” said the Cheyenne Two Moons of the melee. “Sioux, then soldiers, then more Sioux, and all shooting.” Flying Hawk, an Oglala, said it was hard to know exactly what was happening: “The dust was thick and we could hardly see. We got right among the soldiers and killed a lot with our bows and arrows and tomahawks. Crazy Horse was ahead of all, and he killed a lot of them with his war club.” Two Moons said he saw soldiers “drop into the river-bed like buffalo fleeing.” The Minneconjou warrior Red Horse said several troops drowned. Many of the Indians charged across the river after the soldiers and chased them as they raced up the bluffs toward a hill (now known as Reno Hill, for the major who led the soldiers). White Eagle, the son of Oglala chief Horned Horse, was killed in the chase. A soldier stopped just long enough to scalp him—one quick circle-cut with a sharp knife, then a yank on a fistful of hair to rip the skin loose. The whites had the worst of it. More than 30 were killed before they reached the top of the hill and dismounted to make a stand. Among the bodies of men and horses left on the flat by the river below were two wounded Ree scouts. 
The Oglala Red Hawk said later that “the Indians [who found the scouts] said these Indians wanted to die—that was what they were scouting with the soldiers for; so they killed them and scalped them.” The soldiers’ crossing of the river brought a second breathing spell in the fight. Some of the Indians chased them to the top of the hill, but many others, like Black Elk, lingered to pick up guns and ammunition, to pull the clothes off dead soldiers or to catch runaway horses. Crazy Horse promptly turned back with his men toward the center of the great camp. The only Indian to offer an explanation of his abrupt withdrawal was Gall, who speculated that Crazy Horse and Crow King, a leading man of the Hunkpapa, feared a second attack on the camp from some point north. Gall said they had seen soldiers heading that way along the bluffs on the opposite bank. The fight along the river flat—from the first sighting of soldiers riding toward the Hunkpapa camp until the last of them crossed the river and made their way to the top of the hill—had lasted about an hour. During that time, a second group of soldiers had shown itself at least three times on the eastern heights above the river. The first sighting came only a minute or two after the first group began to ride toward the Hunkpapa camp—about five minutes past 3. Ten minutes later, just before the first group formed a skirmish line, the second group was sighted across the river again, this time on the very hill where the first group would take shelter after their mad retreat across the river. At about half-past 3, the second group was seen yet again on a high point above the river not quite halfway between Reno Hill and the Cheyenne village at the northern end of the big camp. By then the first group was retreating into the timber. It is likely that the second group of soldiers got their first clear view of the long sprawl of the Indian camp from this high bluff, later called Weir Point. The Yanktonais White Thunder said he saw the second group make a move toward the river south of the ford by the Cheyenne camp, then turn back on reaching “a steep cut bank which they could not get down.” While the soldiers retraced their steps, White Thunder and some of his friends went east up and over the high ground to the other side, where they were soon joined by many other Indians. In effect, White Thunder said, the second group of soldiers had been surrounded even before they began to fight. From the spot where the first group of soldiers retreated across the river to the next crossing place at the northern end of the big camp was about three miles—roughly a 20-minute ride. Between the two crossings steep bluffs blocked much of the river’s eastern bank, but just beyond the Cheyenne camp was an open stretch of several hundred yards, which later was called Minneconjou Ford. It was here, Indians say, that the second group of soldiers came closest to the river and to the Indian camp. By most Indian accounts it wasn’t very close. Approaching the ford at an angle from the high ground to the southeast was a dry creek bed in a shallow ravine now known as Medicine Tail Coulee. The exact sequence of events is difficult to establish, but it seems likely that the first sighting of soldiers at the upper end of Medicine Tail Coulee occurred at about 4 o’clock, just as the first group of soldiers was making its dash up the bluffs toward Reno Hill and Crazy Horse and his followers were turning back. 
Two Moons was in the Cheyenne camp when he spotted soldiers coming over an intervening ridge and descending toward the river. Gall and three other Indians were watching the same soldiers from a high point on the eastern side of the river. Well out in front were two soldiers. Ten years later, Gall identified them as Custer and his orderly, but more probably it was not Custer. This man he called Custer was in no hurry, Gall said. Off to Gall’s right, on one of the bluffs upriver, some Indians came into sight as Custer approached. Feather Earring, a Minneconjou, said Indians were just then coming up from the south on that side of the river “in great numbers.” When Custer saw them, Gall said, “his pace became slower and his actions more cautious, and finally he paused altogether to await the coming up of his command. This was the nearest point any of Custer’s party ever got to the river.” At that point, Gall went on, Custer “began to suspect he was in a bad scrape. From that time on Custer acted on the defensive.” Others, including Iron Hawk and Feather Earring, confirmed that Custer and his men got no closer to the river than that—several hundred yards back up the coulee. Most of the soldiers were still farther back up the hill. Some soldiers fired into the Indian camp, which was almost deserted. The few Indians at Minneconjou Ford fired back. The earlier pattern repeated itself. Little stood in the soldiers’ way at first, but within moments more Indians began to arrive, and they kept coming—some crossing the river, others riding up from the south on the east side of the river. By the time 15 or 20 Indians had gathered near the ford, the soldiers had hesitated, then begun to ride up out of Medicine Tail Coulee, heading toward high ground, where they were joined by the rest of Custer’s command. The battle known as the Custer Fight began when the small, leading detachment of soldiers approaching the river retreated toward higher ground at about 4:15. This was the last move the soldiers would make freely; from this moment on everything they did was in response to an Indian attack growing rapidly in intensity. As described by Indian participants, the fighting followed the contour of the ground, and its pace was determined by the time it took for Indians to gather in force and the comparatively few minutes it took for each successive group of soldiers to be killed or driven back. The path of the battle follows a sweeping arc up out of Medicine Tail Coulee across another swale into a depression known as Deep Coulee, which in turn opens up and out into a rising slope cresting at Calhoun Ridge, rising to Calhoun Hill, and then proceeds, still rising, past a depression in the ground identified as the Keogh site to a second elevation known as Custer Hill. The high ground from Calhoun Hill to Custer Hill was what men on the plains called “a backbone.” From the point where the soldiers recoiled away from the river to the lower end of Calhoun Ridge is about three-quarters of a mile—a hard, 20-minute uphill slog for a man on foot. Shave Elk, an Oglala in Crazy Horse’s band, who ran the distance after his horse was shot at the outset of the fight, remembered “how tired he became before he got up there.” From the bottom of Calhoun Ridge to Calhoun Hill is another uphill climb of about a quarter-mile. But it would be a mistake to assume that all of Custer’s command—210 men—advanced in line from one point to another, down one coulee, up the other coulee and so on. Only a small detachment had approached the river.
By the time this group rejoined the rest, the soldiers occupied a line from Calhoun Hill along the backbone to Custer Hill, a distance of a little over half a mile. The uphill route from Medicine Tail Coulee over to Deep Coulee and up the ridge toward Custer Hill would have been about a mile and a half or a little more. Red Horse would later say that Custer’s troops “made five different stands.” In each case, combat began and ended in about ten minutes. Think of it as a running fight, as the survivors of each separate clash made their way along the backbone toward Custer at the end; in effect the command collapsed back in on itself. As described by the Indians, this phase of the battle began with the scattering of shots near Minneconjou Ford, unfolding then in brief, devastating clashes at Calhoun Ridge, Calhoun Hill and the Keogh site, climaxing in the killing of Custer and his entourage on Custer Hill and ending with the pursuit and killing of about 30 soldiers who raced on foot from Custer Hill toward the river down a deep ravine. Back at Reno Hill, just over four miles to the south, the soldiers preparing their defenses heard three episodes of heavy firing—one at 4:25 in the afternoon, about ten minutes after Custer’s soldiers turned back from their approach to Minneconjou Ford; a second about 30 minutes later; and a final burst about 15 minutes after that, dying off before 5:15. Distances were great, but the air was still, and the .45/55 caliber round of the cavalry carbine made a thunderous boom. At 5:25 some of Reno’s officers, who had ridden out with their men toward the shooting, glimpsed from Weir Point a distant hillside swarming with mounted Indians who seemed to be shooting at things on the ground. These Indians were not fighting; more likely they were finishing off the wounded, or just following the Indian custom of putting an extra bullet or arrow into an enemy’s body in a gesture of triumph. Once the fighting began it never died away, the last scattering shots continuing until night fell. The officers at Weir Point also saw a general movement of Indians—more Indians than any of them had ever encountered before—heading their way. Soon the forward elements of Reno’s command were exchanging fire with them, and the soldiers quickly returned to Reno Hill. As Custer’s soldiers made their way from the river toward higher ground, the country on three sides was rapidly filling with Indians, in effect pushing as well as following the soldiers uphill. “We chased the soldiers up a long, gradual slope or hill in a direction away from the river and over the ridge where the battle began in good earnest,” said Shave Elk. By the time the soldiers made a stand on “the ridge”—evidently the backbone connecting Calhoun and Custer hills—the Indians had begun to fill the coulees to the south and east. “The officers tried their utmost to keep the soldiers together at this point,” said Red Hawk, “but the horses were unmanageable; they would rear up and fall backward with their riders; some would get away.” Crow King said, “When they saw that they were surrounded they dismounted.” This was cavalry tactics by the book. There was no other way to make a stand or maintain a stout defense. A brief period followed of deliberate fighting on foot. As Indians arrived they got off their horses, sought cover and began to converge on the soldiers. Taking advantage of brush and every little swale or rise in the ground to hide, the Indians made their way uphill “on hands and knees,” said Red Feather. 
From one moment to the next, the Indians popped up to shoot before dropping back down again. No man on either side could show himself without drawing fire. In battle the Indians often wore their feathers down flat to help in concealment. The soldiers appear to have taken off their hats for the same reason; a number of Indians noted hatless soldiers, some dead and some still fighting. From their position on Calhoun Hill the soldiers were making an orderly, concerted defense. When some Indians approached, a detachment of soldiers rose up and charged downhill on foot, driving the Indians back to the lower end of Calhoun Ridge. Now the soldiers established a regulation skirmish line, each man about five yards from the next, kneeling in order to take “deliberate aim,” according to Yellow Nose, a Cheyenne warrior. Some Indians noted a second skirmish line as well, stretching perhaps 100 yards away along the backbone toward Custer Hill. It was in the fighting around Calhoun Hill, many Indians reported later, that the Indians suffered the most fatalities—11 in all. But almost as soon as the skirmish line was thrown out from Calhoun Hill, some Indians pressed in again, snaking up to shooting distance of the men on Calhoun Ridge; others made their way around to the eastern slope of the hill, where they opened a heavy, deadly fire on soldiers holding the horses. Without horses, Custer’s troops could neither charge nor flee. Loss of the horses also meant loss of the saddlebags with the reserve ammunition, about 50 rounds per man. “As soon as the soldiers on foot had marched over the ridge,” the Yanktonais Daniel White Thunder later told a white missionary, he and the Indians with him “stampeded the horses...by waving their blankets and making a terrible noise.” “We killed all the men who were holding the horses,” Gall said. When a horse holder was shot, the frightened horses would lunge about. “They tried to hold on to their horses,” said Crow King, “but as we pressed closer, they let go their horses.” Many charged down the hill toward the river, adding to the confusion of battle. Some of the Indians quit fighting to chase them. The fighting was intense, bloody, at times hand to hand. Men died by knife and club as well as by gunfire. The Cheyenne Brave Bear saw an officer riding a sorrel horse shoot two Indians with his revolver before he was killed himself. Brave Bear managed to seize the horse. At almost the same moment, Yellow Nose wrenched a cavalry guidon from a soldier who had been using it as a weapon. Eagle Elk, in the thick of the fighting at Calhoun Hill, saw many men killed or horribly wounded; an Indian was “shot through the jaw and was all bloody.” Calhoun Hill was swarming with men, Indian and white. “At this place the soldiers stood in line and made a very good fight,” said Red Hawk. But the soldiers were completely exposed. Many of the men in the skirmish line died where they knelt; when their line collapsed back up the hill, the entire position was rapidly lost. It was at this moment that the Indians won the battle. In the minutes before, the soldiers had held a single, roughly continuous line along the half-mile backbone from Calhoun Hill to Custer Hill. Men had been killed and wounded, but the force had remained largely intact. The Indians heavily outnumbered the whites, but nothing like a rout had begun. What changed everything, according to the Indians, was a sudden and unexpected charge up over the backbone by a large force of Indians on horseback. 
The central and controlling part Crazy Horse played in this assault was witnessed and later reported by many of his friends and relatives, including He Dog, Red Feather and Flying Hawk. Recall that as Reno’s men were retreating across the river and up the bluffs on the far side, Crazy Horse had headed back toward the center of camp. He had time to reach the mouth of Muskrat Creek and Medicine Tail Coulee by 4:15, just as the small detachment of soldiers observed by Gall had turned back from the river toward higher ground. Flying Hawk said he had followed Crazy Horse down the river past the center of camp. “We came to a ravine,” Flying Hawk later recalled, “then we followed up the gulch to a place in the rear of the soldiers that were making the stand on the hill.” From his half-protected vantage at the head of the ravine, Flying Hawk said, Crazy Horse “shot them as fast as he could load his gun.” This was one style of Sioux fighting. Another was the brave run. Typically the change from one to the other was preceded by no long discussion; a warrior simply perceived that the moment was right. He might shout: “I am going!” Or he might yell “Hokahey!” or give the war trill or clench an eagle bone whistle between his teeth and blow the piercing scree sound. Red Feather said Crazy Horse’s moment came when the two sides were keeping low and popping up to shoot at each other—a standoff moment. “There was a great deal of noise and confusion,” said Waterman, an Arapaho warrior. “The air was heavy with powder smoke, and the Indians were all yelling.” Out of this chaos, said Red Feather, Crazy Horse “came up on horseback” blowing his eagle bone whistle and riding between the length of the two lines of fighters. “Crazy Horse...was the bravest man I ever saw,” said Waterman. “He rode closest to the soldiers, yelling to his warriors. All the soldiers were shooting at him but he was never hit.” After firing their rifles at Crazy Horse, the soldiers had to reload. It was then that the Indians rose up and charged. Among the soldiers, panic ensued; those gathered around Calhoun Hill were suddenly cut off from those stretching along the backbone toward Custer Hill, leaving each bunch vulnerable to the Indians charging them on foot and horseback. The soldiers’ way of fighting was to try to keep an enemy at bay, to kill him from a distance. The instinct of Sioux fighters was the opposite—to charge in and engage the enemy with a quirt, bow or naked hand. There is no terror in battle to equal physical contact—shouting, hot breath, the grip of a hand from a man close enough to smell. The charge of Crazy Horse brought the Indians in among the soldiers, whom they clubbed and stabbed to death. Those soldiers still alive at the southern end of the backbone now made a run for it, grabbing horses if they could, running if they couldn’t. “All were going toward the high ground at end of ridge,” the Brulé Foolish Elk said. The skirmish lines were gone. Men crowded in on each other for safety. Iron Hawk said the Indians followed close behind the fleeing soldiers. “By this time the Indians were taking the guns and cartridges of the dead soldiers and putting these to use,” said Red Hawk. The boom of the Springfield carbines was coming from Indian and white fighters alike. But the killing was mostly one-sided. In the rush of the Calhoun Hill survivors to rejoin the rest of the command, the soldiers fell in no more pattern than scattered corn. In the depression in which the body of Capt. 
Myles Keogh was found lay the bodies of some 20 men crowded tight around him. But the Indians describe no real fight there, just a rush without letup along the backbone, killing all the way; the line of bodies continued along the backbone. “We circled all round them,” Two Moons said, “swirling like water round a stone.” Another group of the dead, ten or more, was left on the slope rising up to Custer Hill. Between this group and the hill, a distance of about 200 yards, no bodies were found. The mounted soldiers had dashed ahead, leaving the men on foot to fend for themselves. Perhaps the ten who died on the slope were all that remained of the foot soldiers; perhaps no bodies were found on that stretch of ground because organized firing from Custer Hill held the Indians at bay while soldiers ran up the slope. Whatever the cause, Indian accounts mostly agree that there was a pause in the fighting—a moment of positioning, closing in, creeping up. The pause was brief; it offered no time for the soldiers to count survivors. By now, half of Custer’s men were dead, Indians were pressing in from all sides, the horses were wounded, dead or had run off. There was nowhere to hide. “When the horses got to the top of the ridge the gray ones and bays became mingled, and the soldiers with them were all in confusion,” said Foolish Elk. Then he added what no white soldier lived to tell: “The Indians were so numerous that the soldiers could not go any further, and they knew that they had to die.” The Indians surrounding the soldiers on Custer Hill were now joined by others from every section of the field, from downriver where they had been chasing horses, from along the ridge where they had stripped the dead of guns and ammunition, from upriver, where Reno’s men could hear the beginning of the last heavy volley a few minutes past 5. “There were great numbers of us,” said Eagle Bear, an Oglala, “some on horseback, others on foot. Back and forth in front of Custer we passed, firing all of the time.” Kill Eagle, a Blackfeet Sioux, said the firing came in waves. His interviewer noted that he clapped “the palms of his hands together very fast for several minutes” to demonstrate the intensity of the firing at its height, then clapped slower, then faster, then slower, then stopped. In the fight’s final stage, the soldiers killed or wounded very few Indians. As Brave Bear later recalled: “I think Custer saw he was caught in [a] bad place and would like to have gotten out of it if he could, but he was hemmed in all around and could do nothing only to die then.” Exactly when Custer died is unknown; his body was found in a pile of soldiers near the top of Custer Hill surrounded by others within a circle of dead horses. It is probable he fell during the Indians’ second, brief and final charge. Before it began, Low Dog, an Oglala, had called to his followers: “This is a good day to die: follow me.” The Indians raced up together, a solid mass, close enough to whip each other’s horses with their quirts so no man would linger. “Then every chief rushed his horse on the white soldiers, and all our warriors did the same,” said Crow King. In their terror some soldiers threw down their guns, put their hands in the air and begged to be taken prisoner. But the Sioux took only women as prisoners. Red Horse said they “did not take a single soldier, but killed all of them.” The last 40 or more of the soldiers on foot, with only a few on horseback, dashed downhill toward the river.
One of the mounted men wore buckskins; Indians said he fought with a big knife. “His men were all covered with white dust,” said Two Moons. These soldiers were met by Indians coming up from the river, including Black Elk. He noted that the soldiers were moving oddly. “They were making their arms go as though they were running, but they were only walking.” They were likely wounded—hobbling, lurching, throwing themselves forward in the hope of escape. The Indians hunted them all down. The Oglala Brings Plenty and Iron Hawk killed two soldiers running up a creek bed and figured they were the last white men to die. Others said the last man dashed away on a fast horse upriver toward Reno Hill, and then inexplicably shot himself in the head with his own revolver. Still another last man, it was reported, was killed by the sons of the noted Santee warrior chief Red Top. Two Moons said no, the last man alive had braids on his shirt (i.e., a sergeant) and rode one of the remaining horses in the final rush for the river. He eluded his pursuers by rounding a hill and making his way back upriver. But just as Two Moons thought this man might escape, a Sioux shot and killed him. Of course none of these “last men” was the last to die. That distinction went to an unknown soldier lying wounded on the field. Soon the hill was swarming with Indians—warriors putting a final bullet into enemies, and women and boys who had climbed the long slopes from the village. They joined the warriors who had dismounted to empty the pockets of the dead soldiers and strip them of their clothes. It was a scene of horror. Many of the bodies were mutilated, but in later years Indians did not like to talk about that. Some said they had seen it but did not know who had done it. But soldiers going over the field in the days following the battle recorded detailed descriptions of the mutilations, and drawings made by Red Horse leave no room for doubt that they took place. Red Horse provided one of the earliest Indian accounts of the battle and, a few years later, made an extraordinary series of more than 40 large drawings of the fighting and of the dead on the field. Many pages were devoted to fallen Indians, each lying in his distinctive dress and headgear. Additional pages showed the dead soldiers, some naked, some half-stripped. Each page depicting the white dead showed severed arms, hands, legs, heads. These mutilations reflected the Indians’ belief that an individual was condemned to have the body he brought with him to the afterlife. Acts of revenge were integral to the Indians’ notion of justice, and they had long memories. The Cheyenne White Necklace, then in her middle 50s and wife of Wolf Chief, had carried in her heart bitter memories of the death of a niece killed in a massacre whites committed at Sand Creek in 1864. “When they found her there, her head was cut off,” she said later. Coming up the hill just after the fighting had ended, White Necklace came upon the naked body of a dead soldier. She had a hand ax in her belt. “I jumped off my horse and did the same to him,” she recalled. Most Indians claimed that no one really knew who the leader of the soldiers was until long after the battle. Others said no, there was talk of Custer the very first day. The Oglala Little Killer, 24 years old at the time, remembered that warriors sang Custer’s name during the dancing in the big camp that night. Nobody knew which body was Custer’s, Little Killer said, but they knew he was there. 
Sixty years later, in 1937, he remembered a song: Long Hair, Long Hair, I was short of guns, and you brought us many. Long Hair, Long Hair, I was short of horses, and you brought us many. As late as the 1920s, elderly Cheyennes said that two southern Cheyenne women had come upon the body of Custer. He had been shot in the head and in the side. They recognized Custer from the Battle of the Washita in 1868, and had seen him up close the following spring when he had come to make peace with Stone Forehead and smoked with the chiefs in the lodge of the Arrow Keeper. There Custer had promised never again to fight the Cheyennes, and Stone Forehead, to hold him to his promise, had emptied the ashes from the pipe onto Custer’s boots while the general, all unknowing, sat directly beneath the Sacred Arrows that pledged him to tell the truth. It was said that these two women were relatives of Mo-nah-se-tah, a Cheyenne girl whose father Custer’s men had killed at the Washita. Many believed that Mo-nah-se-tah had been Custer’s lover for a time. No matter how brief, this would have been considered a marriage according to Indian custom. On the hill at the Little Bighorn, it was told, the two southern Cheyenne women stopped some Sioux men who were going to cut up Custer’s body. “He is a relative of ours,” they said. The Sioux men went away. Every Cheyenne woman routinely carried a sewing awl in a leather sheath decorated with beads or porcupine quills. The awl was used daily, for sewing clothing or lodge covers, and perhaps most frequently for keeping moccasins in repair. Now the southern Cheyenne women took their awls and pushed them deep into the ears of the man they believed to be Custer. He had not listened to Stone Forehead, they said. He had broken his promise not to fight the Cheyenne anymore. Now, they said, his hearing would be improved. Thomas Powers is the author of eight previous books. Aaron Huey has spent six years documenting life among the Oglala Sioux on the Pine Ridge Reservation in South Dakota. Adapted from The Killing of Crazy Horse, by Thomas Powers. Copyright © 2010. With the permission of the publisher, Alfred A. Knopf.
https://www.smithsonianmag.com/history/how-the-burgess-shale-changed-our-view-of-evolution-3678444/
How the Burgess Shale Changed Our View of Evolution
How the Burgess Shale Changed Our View of Evolution They are, in the opinion of no less an authority than the paleontologist Stephen Jay Gould, “the world’s most important animal fossils”—not Tyrannosaurus rex, not Lucy, but a collection of marine invertebrates mostly a few inches in size, dating from the very dawn of complex life on earth more than 500 million years ago. Their very names—Hallucigenia, Anomalocaris—testify to their strangeness. For decades they have fired the passions of researchers, fueling one of the great scientific controversies of the 20th century, a debate about the nature of life itself. The discovery of the Burgess Shale fossils, high on a mountainside in the Canadian Rockies, is shrouded in legend. It was late August 1909, and an expedition led by the Smithsonian’s longtime Secretary, Charles D. Walcott, was about to pack up. One tale is that a horse ridden by Walcott’s wife, Helena, slipped, overturning a slab of rock that revealed the first astonishing specimens. Whether or not it happened that way—Gould argued against it—Walcott knew he had found something special, and returned the following year, assembling the nucleus of a collection now numbering some 65,000 specimens representing about 127 species. Some were well known, such as the segmented arthropods known as trilobites, others completely novel. They include Opabinia, a five-eyed creature with a grasping proboscis, whose presentation at a scientific conference was regarded at first as a practical joke; Hallucigenia, a marine worm that earned its name when it was originally reconstructed upside-down, so that it appeared to ambulate on seven pairs of stiltlike spines; and Pikaia, an inch-and-a-half-long creature with a spinal rod called a notochord, the earliest known chordate—the group of animals that would later evolve into vertebrates. This was the full flowering of the “Cambrian explosion,” the sudden appearance of a vast new panoply of life-forms—creeping, burrowing and swimming through seas that had held nothing like them in the previous three billion years. Cambrian fossils are known from many sites, but usually only from remains of shells and other hard parts; here, owing to some accident of geology, entire organisms were preserved with eyes, tissue and other soft parts visible. How to classify this trove has been a contentious question. Walcott conservatively tried to place the creatures into groups that were known from other fossils, or living descendants. But decades later, when the Cambridge geologist Harry Whittington and his colleagues took another look, they realized that the Burgess Shale contained not just unique species, but entire phyla (the broadest classification of animals) new to science. The first European to see a kangaroo could not have been more surprised. What made the creatures seem new is that they have no living descendants. They represent entire lineages, major branches on the tree of life, left behind by evolution, most likely in one of the mass extinctions that punctuate the natural history of this planet. Other lineages did survive, including that of the humble Pikaia, which qualifies as at least a collateral ancestor of the vertebrates, including us. And that raises the profound, almost beautiful mystery that Gould saw in the Burgess Shale, the subject of his book Wonderful Life: Why us?
Obvious as the dominance of big-brained mammals may seem, nothing in the Burgess Shale suggests that Pikaia’s offspring were destined for greatness, or even survival, compared, say, with the presumed top predator of those oceans, the two-foot-long shrimplike Anomalocaris. The proliferation of wildly different body plans and the apparently random process by which some thrived while others died out brought to Gould’s mind a lottery, in which the lineage leading to human beings just happened to have held a winning ticket. If one could somehow turn the clock back to the Cambrian and run the game again, there is no reason to think the outcome would be the same. These little creatures, entombed in rock for a half-billion years, are a reminder that we are so very lucky to be here. A science writer and author of the book High Rise, Jerry Adler is a frequent contributor to Smithsonian. He wrote about the role of fire in shaping human evolution in our June issue. Jerry Adler is a former Newsweek editor.
https://www.smithsonianmag.com/history/how-the-flag-came-to-be-called-old-glory-18396/
The Civil War
The Civil War A tale of fidelity, family feud and argument over ownership is the subject of a new inquiry by the Smithsonian National Museum of American History. Old Glory, the weather-beaten 17- by 10-foot banner that has long been a primary NMAH artifact, is second only to Francis Scott Key’s Star-Spangled Banner as a patriotic symbol, and is the source of the term now applied generically to all American flags. “It represents success, righteousness, sovereignty,” says museum director John Gray, but also a conflict that is still “deeply contested in our souls.” During the Civil War, no flag became a more popular symbol of Union loyalty than the worn and imperiled standard belonging to 19th-century sea captain William Driver, who was originally from Salem, Massachusetts. His defiant flying of it—from his Nashville, Tennessee, household in the midst of the conflict—made national news. Civil War-era citizens felt so passionately about flags that after the surrender of Fort Sumter, the garrison ensign toured the country for the duration of the war. The poet and hospital attendant Walt Whitman lamented the amount of blood spent to retain a simple, four-cornered regimental rag. “I have a little flag....It was taken by the Secesh [secessionists] in a cavalry fight, and rescued by our men in a bloody little skirmish,” Whitman wrote. “It cost three men’s lives, just to get one little flag, four by three.” The flag was originally designed to unfurl grandly from a ship’s mast. Driver received the homemade flag with 24 stars in 1824, sewn for him by his mother and a group of young Salem female admirers to celebrate his appointment, at the age of just 21, as a master mariner and commander of his own ship, the Charles Doggett. According to legend, when Driver raised the flag up the main mast, he lifted his hat and declaimed, “My ship, my country, and my flag, Old Glory.” However, Salem historian Bonnie Hurd Smith has found “no evidence whatsoever” that Driver made such a stiffly grandiose pronouncement. He more likely named the flag when reflecting on his adventurous 20-year career as an American merchant seaman who sailed to China, India, Gibraltar and throughout the South Pacific, at one point ferrying descendants of the HMS Bounty mutineers from Tahiti to Pitcairn Island under the flag. “It has ever been my staunch companion and protection,” he wrote. “Savages and heathens, lowly and oppressed, hailed and welcomed it at the far end of the wide world. Then, why should it not be called Old Glory?” A portrait of Driver as a young captain shows a dashing man with black sideburns, a confident smile and a frothy white shirt. He made profits in the tortoise-shell trade, and could converse a bit in Fijian. Family memoirs tell stories of him seizing the wheel of his ship himself in gales, and facing down a hostile tribal chief in New Zealand with a pistol in hand and a dirk in his mouth. “The flag embodied America as he knew it at that point, going across the world,” says NMAH curator Jennifer Locke Jones. “He carried it with him and it was the pride of this independent free spirit. He was taking a bit of America to uncharted territories and he felt very proud that this was the symbol he flew under. He took a piece of his home with him wherever he went.” In 1837, Driver gave up seafaring after his wife, Martha Silsbee Babbage, died from throat cancer, leaving him with three young children. Driver decided to settle in Nashville, where his three brothers had opened a store.
Only 34 years old, he quickly remarried the next year, choosing a Southern girl less than half his age, Sarah Jane Parks, and started a second family that grew to nine children. Driver flew his flag on holidays “rain or shine,” according to one of his Nashville-born daughters, Mary Jane Roland. It was so large that he attached it to a rope from his attic window and stretched it on a pulley across the street to secure it to a locust tree. In 1860, according to Roland, he and his wife and daughters repaired it, sewing on the additional ten stars, and Driver himself appliquéd a small white anchor in the lower right corner to signify his career. But as secession neared, Driver’s flag became a source of contention, and by the outbreak of the war, Driver’s own family was bitterly riven. Two of his sons were fervent Confederates and enlisted in local regiments; one of them would later die of his wounds at the Battle of Perryville. One can only imagine the tensions between the Salem-born and Nashville-born Drivers, whose relations may have already been strained by first- and second-family rivalry. In March 1862, Driver wrote despairingly, “Two sons in the army of the South! My entire house estranged...and when I come home...no one to soothe me.” Local Confederates attempted to seize Old Glory soon after Tennessee seceded. When Gov. Isham G. Harris sent a committee to Driver’s house to demand the flag, Driver met the men at the door. Picture a defiant 58-year-old with a chest still barrel-full and an out-thrust chin. “Gentlemen...if you are looking for stolen property in my house, produce your search-warrant,” he declared. Cowed, the committee left the premises. Unsatisfied, local guerrillas made another attempt to seize the flag. When an armed squad arrived on Driver’s front porch, he stalked out to confront them. “If you want my flag you’ll have to take it over my dead body,” he threatened. They retreated. Driver, by now convinced that the flag was in imminent danger, decided to hide it. With the help of the more loyal women in a neighboring household, it was sewn into a coverlet. It remained there until late February 1862, when Nashville became the first Southern capital to fall. Union troops led by the Sixth Ohio entered the city. When Driver saw the Stars and Stripes and regimental colors of the Sixth Ohio go up the flagstaff of the capitol, he made his way there and sought out the Union commander, Gen. William “Bull” Nelson. As Nelson’s aide Horace Fisher recalled it, “A stout, middle-aged man, with hair well shot with gray, short in stature, broad in shoulder, and with a roll in his gait, came forward and asked, ‘Who is the General in command? I wish to see him.’” Driver introduced himself as a former sea captain and loyal Unionist and then produced his coverlet. Fisher recalled: “Capt. Driver—an honest-looking, blunt-speaking man, was evidently a character; he carried on his arm a calico-covered bedquilt; and, when satisfied that Gen. Nelson was the officer in command, he pulled out his jack-knife and began to rip open the bedquilt without another word. We were puzzled to think what his conduct meant.” Finally, Fisher added, “the bedquilt was safely delivered of a large American flag, which he handed to Gen. Nelson, saying, ‘This is the flag I hope to see hoisted on that flagstaff in place of the [damned] Confederate flag set there by that [damned] rebel governor, Isham G. Harris. 
I have had hard work to save it; my house has been searched for it more than once.’ He spoke triumphantly, with tears in his eyes.” General Nelson accepted the flag and ordered it run up on the statehouse flagstaff. Roland claimed to have witnessed what happened next: It was greeted with “frantic cheering and uproarious demonstrations by soldiers,” many of them from the Sixth Ohio. The regiment would adopt “Old Glory” as its motto. The confusion over flags began later that night, when a storm threatened to tear the banner to pieces. Driver apparently replaced it with a newer, stronger one, and once again stowed Old Glory away for safekeeping. There were also reports that Driver gave a flag to the Sixth Ohio as it left the city. According to Roland, however, the main flag remained stored in the Driver home until December 1864 and the second battle for Nashville. Confederate Gen. John Bell Hood fought his army to bits trying to retake the city. As the battle raged, Driver hung his flag out of the third-story window “in plain sight,” according to Roland. He then went to join the defense of the city, telling his household before he left, “If Old Glory is not in sight, I’ll blow the house out of sight too.” Driver spent the rest of the war as a provost marshal of Nashville and worked in hospitals. According to Roland, several years before his death, he gave her the flag as a gift, on July 10, 1873. “This is my old ship flag Old Glory,” he told her. “I love it as a mother loves her child; take it and cherish it as I have always cherished it; for it has been my steadfast friend and protector in all parts of the world—savage, heathen and civilized.” *** William Driver died on March 3, 1886, and was buried in Nashville. That same year saw the genesis of the family feud over the flag when his niece, Harriet Ruth Waters Cooke, daughter of his youngest sister and a Salem-born socialite highly conscious of her genealogy, claimed to have inherited it. She presented her version of Old Glory to the Essex Institute in Salem (now the Peabody Essex Museum), along with family memorabilia that included a letter from the Pitcairn Islanders to Driver. Why Driver would have given his precious flag to a niece in far-off Massachusetts is unclear—perhaps because he didn’t trust his Confederate-sympathizing children to care for it? Cooke also produced a family memoir that she self-published in 1889, in which she omitted the existence of Driver’s daughter Mary Jane. Roland fought back. She set about documenting the history of the flag her father had given her, and in 1918 published her own account, Old Glory, The True Story, in which she disputed elements of Cooke’s narrative and presented documentary evidence for her claim. In 1922, Roland presented her Old Glory as a gift to President Warren G. Harding, who in turn delivered it to the Smithsonian. That same year, the Peabody Essex also sent its Old Glory to the Smithsonian. But the museum chose to regard Roland’s flag as the more important one: It was directly descended from Driver, and documentary evidence in the Tennessee State Library and Archives strongly suggested it was the one hidden in the quilt and presented to Union troops who took Nashville. It also had common sense on its side: Driver would have hoisted his largest flag over the capitol dome. The Peabody flag sank into insignificance. It has remained on loan at the Smithsonian since 1922, but has gone largely unexamined, given the emphasis on the larger Old Glory.
However, it became the subject of renewed curiosity this July during a conservation evaluation of both flags by curator Jones and textile conservator Suzanne Thomassen-Krauss. As they surveyed both flags, they began discussing the odd family history, which has been periodically resurrected in local Salem news stories along with suggestions that the Peabody flag might have a legitimate claim. They decided to embark on a more exhaustive analysis of both flags. It’s unlikely that the Smithsonian project will lay to rest the 125-year-old family quarrel. Nor is it likely that the smaller, 12- by 6-foot Peabody flag will supplant the traditional Old Glory in the eyes of Smithsonian curators, who report that the preliminary study indicates that the larger flag still has the much stronger claim. But the Peabody flag is a historical curiosity in its own right, says Jones. Initial analysis shows it is a legitimate Driver family heirloom and Civil War-era relic, but it is also something of a mystery, with several anomalies. According to textile preservationist Fonda Thomsen, who has helped conserve articles ranging from flags to the garments President Lincoln was wearing when he was assassinated, a single thread can tell a story. Each flag will contain signatures, clues left in seams and stitching, as well as in the dyes and materials used. “You can determine, were they made by the same person?” Thomsen says. “Did they finish their seam the same way, the stars the same way? How did they knot it off? Everybody leaves a little trail of their work.” Although the Old Glory textile project is just beginning, there have already been a couple of definitive conclusions. While the Peabody flag clearly dates to the same era as the larger Old Glory, it lacks the wear and tear of a seagoing flag. The fly edge is intact and not worn. In fact it seems as if the flag was hardly flown. “What we’re looking at is inconsistent with use on a naval vessel,” Jones says. There are also baffling soil lines on the flag, and parts of it appear to be newer than others. “We’re thinking parts of it are older, and parts are questionable,” Jones says. “It could be that it was remade.” The larger Old Glory has wear and tear consistent with seafaring. It was indeed made during the 1820s and has all the earmarks of a heavily used naval flag. Its fly edge shows signs of wear, suggesting it spent a lot of time flapping in stiff winds. “When a flag is flown, you get distortion of the fabric, and wear on the leading edge,” Thomsen says. “It beats the bejesus out of them.” This does not mean the Peabody flag is illegitimate. Captain Driver would have had more than one flag: Ship captains carried ceremonial flags, storm flags and flags designed to be visible from very long distances. Driver family memoirs and other records contain references to a “merino” flag owned by the captain, a storm flag, and then there was the flag that was draped over his coffin. The Peabody flag surely has a story in its own right. “We’re looking at where it resided, the history of it and then, at the object itself, asking, ‘What are you telling us?’” Jones says. Paula Richter, curator for the Peabody Essex, is awaiting the outcome of the analysis before she offers an opinion. “It does seem like there is a growing consensus that the Smithsonian’s is the actual Old Glory, but it’s interesting to think about the relationship [of the two flags] to each other,” she says.
Also intriguing is the fact that the Peabody Essex Museum’s card catalog contains other “remnants” of flags purporting to be pieces of Old Glory, gifts from various donors. These may well be pieces of Old Glory—“souvenir” patches that were cut away, a common practice with treasured Civil War banners. There is no evidence of “souveniring” of the Peabody flag. But Jones believes that other items from the Peabody Essex catalog may match the weave of the Smithsonian flag. Each vestige, even the most fragmentary scrap, is potentially meaningful. “Pieces of those flags are held sacred,” Jones says. “They embody a common experience.”
https://www.smithsonianmag.com/history/how-three-doughboys-experienced-last-days-world-war-i-180970701/
World War I: 100 Years Later
World War I: 100 Years Later Sgt. Harold J. Higginbottom. 2nd Lt. Thomas Jabine. Brigadier General Amos A. Fries. When these three U.S. servicemen heard the news about the armistice ending the First World War, they were in three very different circumstances. Their stories, told below in an excerpt from Theo Emery’s Hellfire Boys: The Birth of the U.S. Chemical Warfare Service and the Race for the World’s Deadliest Weapons, offer a window into how the war was still running hot until its very last hours. While Emery’s book details the rapid research and development of chemical weapons in the U.S. during the war and follows the young men of the First Gas Regiment, it also connects readers to the seemingly abstract lives of 100 years ago. *********** Daylight was fading on November 8 as Harold “Higgie” Higginbottom and his platoon started through the woods in the Argonne. Branches slapped their faces as they pushed through the undergrowth. Their packs were heavy, and it began to rain. There was no path, no road, just a compass guiding them in the dark. Whispers about an armistice had reached all the way to the front. “There was a rumor around today that peace had been declared,” Higgie wrote in his journal. If there was any truth to it, he had yet to see it. Rumors of peace or no, Company B still had a show to carry out. Its next attack was some 15 miles to the north, in an exposed spot across the Meuse River from where the Germans had withdrawn. The trucks had brought them partway, but shells were falling on the road, so the men had to get out of the open and hike under cover. They waded across brooks and swamps and slithered down hills, cursing as they went. Some of the men kept asking the new lieutenant in charge where they were going. One man fell down twice and had trouble getting back up; the other men had to drag him to his feet. They found a road; the mud was knee deep. Arching German flares seemed to be directly overhead, and even though the men knew that the Meuse River lay between the armies, they wondered if they had somehow blundered into enemy territory. Water soaked through Higgie’s boots and socks. When they finally stopped for the night, the undergrowth was so dense it was impossible to camp, so Higgie just rolled himself up in his tent as best he could and huddled on the hillside. Higgie awoke the next morning in a pool of water. He jumped to his feet, cursing. Mud was everywhere, but at least in daylight they could see their positions and where they were going. He carried bombs up to the advance position, returned for coffee, then made another carry, sliding in the mud. More of the company joined them in carrying mortars up to the front. Higgie had begun to feel better—the hike had warmed him up, and he had found a swell place to camp that night, a spot nestled among trees felled by the Germans. Everyone was cold and wet and caked in mud, but at least Higgie had found a dry spot. When he went to bed, the air was so cold that he and another man kept warm by hugging each other all night. When the frigid morning of November 10 arrived, some of the men lit pieces of paper and tucked them into their frozen boots to thaw them out. Higgie made hot coffee and spread his blankets out to dry.
Late that night, the 177th Brigade was going to ford the Meuse, and Higgie’s company was to fire a smoke screen to draw fire away from the advancing infantry. Elsewhere, the Hellfire Regiment had other shows. At 4:00 p.m., Company A shot phosgene at a machine-gun position, forcing the Germans to flee. That night, Company D fired thermite shells over German machine-gun positions about six miles north of Higgie and put up a smoke screen that allowed the Fourth Infantry to cross the Meuse. Higgie rolled himself up in blankets to sleep before the show late that night. But his show was canceled, the infantry forded the river without the smoke screen, and Higgie couldn’t have been happier. He swaddled himself back up in his blanket and went back to bed. Higgie was dead asleep when a private named Charles Stemmerman shook him awake at 4:00 a.m. on November 11. Shells were falling again, and he wanted Higgie to take cover deeper in the forest. Their lieutenant and sergeant had already retreated into the woods. Higgie shrugged off the warning. If the shells got closer, he would move, he told the private. Then he turned over and went back to sleep. He awoke again around 8:00 a.m. The early morning shell barrage had ended. In the light of morning, an impenetrable fog blanketed the forest, so dense that he couldn’t see more than ten feet around him. He got up to make breakfast and prepared for the morning show, a mortar attack with thermite. Then the lieutenant appeared through the mist with the best news Higgie had heard in a long time. All guns would stop firing at 11 o’clock. The Germans had agreed to the Armistice terms. The war had ended. Higgie thought in disbelief that maybe the lieutenant was joking. It seemed too good to be true. He rolled up his pack and retreated deeper into the woods, just to be on the safe side. They had gone through so much, had seen so many things that he would have thought impossible, that he wasn’t going to take any chances now. *********** To the southeast, Tom Jabine’s old Company C was preparing a thermite attack on a German battalion at Remoiville. Zero hour was 10:30 a.m. With 15 minutes to go, the men saw movement across the line. The company watched warily as 100 German soldiers stood up in plain view. As they got to their feet, they thrust their hands into their pockets—a gesture of surrender. An officer clambered up out of the German trench. The Americans watched as he crossed no-man’s-land. The armistice had been signed, the German officer said, and asked that the attack be canceled. Suspecting a trap, the Americans suspended the operation but held their positions, just in case. Minutes later, word arrived from the 11th Infantry. It was true: The armistice had been signed. The war was over. Hundreds of miles away, the sound of whistles and church bells reached Tom Jabine as he lay in his hospital bed in the base in Nantes, where he had arrived a few days earlier. For days after a mustard shell detonated in the doorway of his dugout in October, he had lain in a hospital bed in Langres, inflamed eyes swollen shut, throat and lungs burning. After a time, the bandages had come off, and he could finally see again. He still couldn’t read, but even if he could, letters from home had not followed him to the field hospital. The army had not yet sent official word about his injuries, but after his letters home abruptly stopped, his family back in Yonkers must have feared the worst. In early November, the army transferred him to the base hospital in Nantes. 
Not a single letter had reached Tom since his injury. He could walk, but his eyes still pained him, and it was difficult to write. More than three weeks after he was gassed, he had been finally able to pick up a pen and write a brief letter to his mother. “I got a slight dose of Fritz’s gas which sent me to the hospital. It was in the battle of the Argonne Forest near Verdun. Well I have been in the hospital ever since and getting a little better every day.” When the pealing from the town spires reached his ears, he reached for pen and paper to write to his mother again. “The good news has come that the armistice has been signed and the fighting stopped. We all hope this means the end of the war and I guess it does. It is hard to believe it is true, but I for one am thankful it is so. When we came over I never expected to see this day so soon if I ever saw it at all,” he wrote. Now, perhaps, he could rejoin his company and go home. “That seems too good to be true but I hope it won’t be long.” *********** Amos Fries was at general headquarters in Chaumont when the news arrived. Later in the day, he drove into Paris in his Cadillac. Shells had fallen just days earlier; now the city erupted in celebration. After four years of bloodshed, euphoria spilled through the city. As Fries waited in his car, a young schoolgirl wearing a blue cape and a hood jumped up on the running board. She stuck her head in the open window and blurted to Fries with glee: “La guerre est fini!” — The war is over! — and then ran on. Of all the sights that day, that was the one Fries recounted in his letter home the next day. “Somehow that sight and those sweet childish words sum up more eloquently than any oration the feeling of France since yesterday at 11 a.m.” As the city roiled in jubilation, a splitting headache sent Fries to bed early. The festivities continued the next day; Fries celebrated with a golf game, then dinner in the evening. “Our war work is done, our reconstruction and peace work looms large ahead. When will I get home? ‘When will we get home?’ is the question on the lips of hundreds of thousands.” *********** Like the turn of the tide, the movement of the American army in the Argonne stopped and reversed, and the men of the gas regiment began retreating south. Hours earlier, the land Higginbottom walked on had been a shooting gallery in a firestorm. Now silence fell over the blasted countryside. For Higgie, the stillness was disquieting after months of earthshaking detonations. He still couldn’t believe the end had come. The company loaded packs on a truck and started hiking to Nouart, about 14 miles south. They arrived in the village at about 5:30 p.m. Higgie went to bed not long after eating. He felt ill after days of unending stress and toil. But he couldn’t sleep. As he lay in the dark with the quiet pressing in around him, he realized that he missed the noise of the guns. He awoke in the morning to the same eerie stillness. After breakfast, he threw his rolled-up pack on a truck and began the 20-mile hike back to Montfaucon. Everything seemed so different now as he retraced his steps. Everything was at a standstill. Nobody knew what to make of things. They arrived at Montfaucon after dark. The moon was bright and the air very cold with a fierce wind blowing. The men set up pup tents on the hilltop, where the shattered ruins of the village overlooked the valley. 
A month before, German planes had bombed the company as they camped in the lowlands just west of Montfaucon, scattering men and lighting up the encampment with bombs. For months, open fires had been forbidden at the front, to keep the troops invisible in the dark. Now, as Higgie sat on the moonlit hilltop, hundreds of campfires blazed in the valley below. Excerpted from Hellfire Boys: The Birth of the U.S. Chemical Warfare Service and the Race for the World’s Deadliest Weapons. Copyright © 2017 by Theo Emery. Used with permission of Little, Brown and Company, New York. All rights reserved.
4b6c7e5a95adc8508889e4bec960c10f
https://www.smithsonianmag.com/history/how-uss-values-and-american-exceptionalism-have-shaped-our-pandemic-response-180976573/
How the Belief in American Exceptionalism Has Shaped the Pandemic Response
How the Belief in American Exceptionalism Has Shaped the Pandemic Response The spread of the coronavirus in the U.S. is out of control: As of December 1, more than 13.5 million people have been infected nationwide and some 269,000 people have died. Yet many in the U.S. still resist wearing masks in public and even deem mask orders and social distancing guidelines as affronts to their personal freedoms. For political scientists like Deborah Schildkraut of Tufts University in Medford, Massachusetts, the U.S. response to the pandemic can be seen through the lens of American identity. For more than two decades, Schildkraut has been studying what it means to be American, a topic she explored in an article in the Annual Review of Political Science. In it, she wrote that scholars increasingly regard American identity as a social identity, “which refers to the part of a person’s sense of self that derives from his or her membership in a particular group and the value or meaning that he or she attaches to such membership.” According to Schildkraut, at a minimum American identity consists of two sets of norms. One involves an evolving set of beliefs that anyone can follow. These beliefs harken back to Thomas Jefferson and the ideals set forth in the Declaration of Independence (“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are Life, Liberty and the pursuit of Happiness.”) The other set of norms depends on attributes such as one’s race and religion. Knowable Magazine spoke with Schildkraut about the sometimes contradictory attributes Americans consider to be at the core of their national identity, the evolution of these ideas and the impact they have on the country’s ability to confront the pandemic. This conversation has been edited for length and clarity. Why is one’s identity so important? Social psychologists have written about the need to have positive distinctiveness. We like to feel good about the things that we think are unique about us. That drives a lot of in-group and out-group thinking. We like to think good things about the groups that we belong to. It doesn’t always lead to thinking bad things about the groups that we don't belong to, but it easily can. What’s an American identity, and has it evolved over time? Some parts of it haven’t evolved all that much. A lot of the things people think of as being uniquely American are appropriately called aspirational: the idea of individualism, equality of opportunity, self-governance and engaged citizenship. For as long as we’ve been asking people how important certain things are in being American, there’s not been much variation over time in those kinds of things. You see more change over time on issues that are more explicitly about race and ethnicity. There’s this idea of being a nation of immigrants. It’s the American creed: the idea that anybody can become American if they do and believe certain things, and that your country of origin, the language you speak, your religion, all of that is separate from becoming American. It’s crucially tied to the notion of the work ethic and that the opportunities are here for the taking. Of course, we know in practice that hasn’t been true. The aspiration is that race and religion don’t matter. And that anybody can be a true American. 
We know that in reality, certainly at an unstated level, when people think of what an American is many have an ideal in mind: It’s white, Christian and, honestly, male. The U.S. is an extremely diverse country. How do different groups of people react to these aspirational ideals of individualism, equality of opportunity, self-governance and engaged citizenship? We have done surveys in which we ask people what they think are the important things in making someone a true American. One of the big stories across all the years that we’ve been asking this is that a lot of the variation we see comes down more to party and ideology than it does really to race. There’s actually a lot of agreement on the things that are considered to be most essential such as respecting America’s political institutions and laws and believing in individualism. There’s also considerable agreement on things that are considered less essential, such as the language one speaks, or whether someone was born in the U.S. or has European ancestry. What does individualism mean in this context? Individualism is tied to the notion of minimal government intervention. So that people are free to pursue what they want, with rare exceptions where it may be necessary for the government to intervene so that they don’t inflict harm on others. Does American individualism conflict with other values? Most Americans believe in and want certain values to be prevalent in their lives and they want the government to support them. Some of these key values are freedom, equality and order. Those don’t always go together. And when they conflict—and politics can be thought of as a conflict between these values—the government has to pick one. What’s the effect of these conflicts on the U.S. response to the pandemic? You see the conflicts between freedom and order and freedom and equality playing out now, in how we’re responding to the coronavirus pandemic. People want freedom to be able to go where they want, to not wear a mask if they don’t want to, and that conflicts with the government imposing some kind of order to address the pandemic. We also know that this pandemic has exposed great inequalities and that in places where they are choosing freedom they are not addressing those inequalities, and maybe making them even worse. Other democracies might be more likely to pick equality over freedom when those two conflict; in the U.S., we tend to pick freedom, although there are certainly exceptions. In any society, there’s always going to be some degree of autonomy that people have to give up in order for society to function, for us to live as a collective. What type of autonomy are you willing to give up? When are you willing to give it up? In the U.S., nobody bats an eye at the idea that we all have to stop at red lights on the road, even though that’s an infringement on our freedoms. But any time it’s something new that we’re not already used to, there will be resistance to it. There’s also a deep distrust among Americans towards government, and they often do not believe that government will execute programs efficiently or use its resources responsibly. Compared with other countries, we also have the complexity of federalism where we value devolving power to the states in some areas, but not others. And people like to celebrate their state identities. Part of our national character is the immense variation across the states, and all that feeds into our response to the pandemic. 
Have other countries demonstrated a tendency to put equality before freedom and does that influence the policies they pursue? Countries that have multiparty systems, where there might be a stronger Labor Party, or a Democratic Socialist Party, where you have a stronger history of a welfare state, places that have national health care systems, for example—those are all evidences of greater government intervention and less reliance on people going it alone and figuring it out for themselves. In those countries, there’s an acceptance that government intervention is something of value so that there’s some equity and equality, and that the government is going to play a bigger role to ensure some minimum quality of life. How else can one understand the U.S. response to the pandemic, seen from the perspective of American identity? I don’t pretend to have the answers. There’s one thing that has long been puzzling to me: President Trump’s insistence that this is not a big deal. At least initially, where there were lockdowns, there was this real sense of national purpose and community. People were applauding health care workers in the streets and putting up teddy bears in their windows for kids to go on scavenger hunts in their neighborhoods. There was this sense of solidarity that didn’t really last very long. We know from a lot of political science research that elite rhetoric (meaning messages coming from prominent elected officials) can be really powerful. Once a politician decides to take a certain line—that this is not a big deal, places should be able to do what they want, we should prioritize freedom and so on—it’s not that surprising that many Americans would follow suit and prioritize that interpretation of American identity as well. Can that messaging be changed? There’s a lot of potential for leadership here to frame this in terms of national sacrifice: that this is who we are as Americans and we can find ways to come together to solve this. Joe Biden is now President-elect. Do you foresee a sea change in how the U.S. will respond to this pandemic, because of the messaging that might come from his administration? I would hope so. But I’m not particularly optimistic, because while Trump has clearly been the leader of his party and the leader of the country during this time, he could really only have been successful with the support of the Republican Party. And all of those other politicians who either repeated what he said or didn’t contradict it are still going to be there. One thing Trump certainly demonstrated is that you can do a lot with the executive powers of the presidency. And so even if Biden doesn’t get a lot of cooperation from Congress, there are lots of things he can do on his own with the executive branch. In terms of this idea that we are facing this national crisis, wouldn’t it be great if there was a sense of common purpose and common identity? We know that elite messaging can matter. And hopefully, there are enough people who are either already predisposed to support Biden’s messaging or just fed up with politics and conflict, that it would make them receptive to that kind of messaging. A cynic would say that politicians are manufacturing identities and then manipulating them. Is that possible? Oh, it’s definitely possible. It may be a strategy that’s helpful for winning in the short term, but isn’t necessarily in a political party’s long-term interest. We think of this a lot with the contemporary Republican Party. 
They may be trying to increase the salience of a white identity, for example. In the short term, this may be a winning strategy in enough places for the Republican Party, but it’s not going to be a long-term strategy as the population continues to change. Is that because the notion of what it means to be American is somehow changing because of increasing diversity and immigration? That’s right. The younger generation today, which will be the dominant makeup of voters in the not too distant future, is much more diverse. Whether they are going to find a campaign that capitalizes on white racial anxiety attractive or not remains to be seen, but it’s going to be harder than it is now. What have the last nine months been like for you, personally and professionally? A group of us political scientists joke—a kind of gallows humor—that some of these really bad things that are happening are great for political science. People who study anxiety and people who study anger and its political effects are getting great data. The problem is, none of us have time to actually do the research, because we’re all home with our kids. And that’s a concern, because political scientists can contribute to our understanding of a lot of big problems. This article is part of Reset: The Science of Crisis & Recovery, an ongoing series exploring how the world is navigating the coronavirus pandemic, its consequences and the way forward. Reset is supported by a grant from the Alfred P. Sloan Foundation. This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews.
81c6b4c10c9f54a58cc088798041b16d
https://www.smithsonianmag.com/history/how-woodrow-wilsons-propaganda-machine-changed-american-journalism-180963082/
How Woodrow Wilson’s Propaganda Machine Changed American Journalism
How Woodrow Wilson’s Propaganda Machine Changed American Journalism When the United States declared war on Germany 100 years ago, the impact on the news business was swift and dramatic. In its crusade to “make the world safe for democracy,” the Wilson administration took immediate steps at home to curtail one of the pillars of democracy – press freedom – by implementing a plan to control, manipulate and censor all news coverage, on a scale never seen in U.S. history. Following the lead of the Germans and British, Wilson elevated propaganda and censorship to strategic elements of all-out war. Even before the U.S. entered the war, Wilson had expressed the expectation that his fellow Americans would show what he considered “loyalty.” Immediately upon entering the war, the Wilson administration brought the most modern management techniques to bear in the area of government-press relations. Wilson started one of the earliest uses of government propaganda. He waged a campaign of intimidation and outright suppression against those ethnic and socialist papers that continued to oppose the war. Taken together, these wartime measures added up to an unprecedented assault on press freedom. I study the history of American journalism, but before I started researching this episode, I had thought that the government’s efforts to control the press began with President Roosevelt during WWII. What I discovered is that Wilson was the pioneer of a system that persists to this day. All Americans have a stake in getting the truth in wartime. A warning from the WWI era, widely attributed to Sen. Hiram Johnson, puts the issue starkly: “The first casualty when war comes is truth.” Within a week of Congress declaring war, on April 13, 1917, Wilson issued an executive order creating a new federal agency that would put the government in the business of actively shaping press coverage. That agency was the Committee on Public Information, which would take on the task of explaining to millions of young men being drafted into military service – and to the millions of other Americans who had so recently supported neutrality – why they should now support war. The new agency – which journalist Stephen Ponder called “the nation’s first ministry of information” – was usually referred to as the Creel Committee for its chairman, George Creel, who had been a journalist before the war. From the start, the CPI was “a veritable magnet” for political progressives of all stripes – intellectuals, muckrakers, even some socialists – all sharing a sense of the threat to democracy posed by German militarism. Idealistic journalists like S.S. McClure and Ida Tarbell signed on, joining others who shared their belief in Wilson’s crusade to make the world safe for democracy. At the time, most Americans got their news through newspapers, which were flourishing in the years just before the rise of radio and the invention of the weekly news magazine. In New York City, according to my research, nearly two dozen papers were published every day – in English alone – while dozens of weeklies served ethnic audiences. Starting from scratch, Creel organized the CPI into several divisions using the full array of communications. The Speaking Division recruited 75,000 specialists who became known as “Four-Minute Men” for their ability to lay out Wilson’s war aims in short speeches. The Film Division produced newsreels intended to rally support by showing images in movie theaters that emphasized the heroism of the Allies and the barbarism of the Germans. 
The Foreign Language Newspaper Division kept an eye on the hundreds of weekly and daily U.S. newspapers published in languages other than English. Another CPI unit secured free advertising space in American publications to promote campaigns aimed at selling war bonds, recruiting new soldiers, stimulating patriotism and reinforcing the message that the nation was involved in a great crusade against a bloodthirsty, antidemocratic enemy. Some of the advertising showed off the work of another CPI unit. The Division of Pictorial Publicity was led by a group of volunteer artists and illustrators. Their output included some of the most enduring images of this period, including the portrait by James Montgomery Flagg of a vigorous Uncle Sam, declaring, “I WANT YOU FOR THE U.S. ARMY!” ********** Other ads showed cruel “Huns” with blood dripping from their pointed teeth, hinting that Germans were guilty of bestial attacks on defenseless women and children. “Such a civilization is not fit to live,” one ad concluded. Creel denied that his committee’s work amounted to propaganda, but he acknowledged that he was engaged in a battle of perceptions. “The war was not fought in France alone,” he wrote in 1920, after it was all over, describing the CPI as “a plain publicity proposition, a vast enterprise in salesmanship, the world’s greatest adventure in advertising.” For most journalists, the bulk of their contact with the CPI was through its News Division, which became a veritable engine of propaganda on a par with similar government operations in Germany and England but of a sort previously unknown in the United States. In the brief year and a half of its existence, the CPI’s News Division set out to shape the coverage of the war in U.S. newspapers and magazines. One technique was to bury journalists in paper, creating and distributing some 6,000 press releases – or, on average, handing out more than 10 a day. The whole operation took advantage of a fact of journalistic life. In times of war, readers hunger for news and newspapers attempt to meet that demand. But at the same time, the government was taking other steps to restrict reporters’ access to soldiers, generals, munitions-makers and others involved in the struggle. So, after stimulating the demand for news while artificially restraining the supply, the government stepped into the resulting vacuum and provided a vast number of official stories that looked like news. Most editors found the supply irresistible. These government-written offerings appeared in at least 20,000 newspaper columns each week, by one estimate, at a cost to taxpayers of only US$76,000. In addition, the CPI issued a set of voluntary “guidelines” for U.S. newspapers, to help those patriotic editors who wanted to support the war effort (with the implication that those editors who did not follow the guidelines were less patriotic than those who did). The CPI News Division then went a step further, creating something new in the American experience: a daily newspaper published by the government itself. Unlike the “partisan press” of the 19th century, the Wilson-era Official Bulletin was entirely a governmental publication, sent out each day and posted in every military installation and post office as well as in many other government offices. In some respects, it is the closest the United States has come to a paper like the Soviet Union’s Pravda or China’s People’s Daily. The CPI was, in short, a vast effort in propaganda. 
The committee built upon the pioneering efforts of public relations man Ivy Lee and others, developing the young field of public relations to new heights. The CPI hired a sizable fraction of all the Americans who had any experience in this new field, and it trained many more. One of the young recruits was Edward L. Bernays, a nephew of Sigmund Freud and a pioneer in theorizing about human thoughts and emotions. Bernays volunteered for the CPI and threw himself into the work. His outlook – a mixture of idealism about the cause of spreading democracy and cynicism about the methods involved – was typical of many at the agency. “The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society,” Bernays wrote a few years after the war. “Propaganda is the executive arm of the invisible government.” All in all, the CPI proved quite effective in using advertising and PR to instill nationalistic feelings in Americans. Indeed, many veterans of the CPI’s campaign of persuasion went into careers in advertising during the 1920s. The full bundle of techniques pioneered by Wilson during the Great War was updated and used by later presidents when they sent U.S. forces into battle. Christopher B. Daly, Professor of Journalism, Boston University
1f9cf6510b07a9353c3836138fee16fe
https://www.smithsonianmag.com/history/how-writers-timeless-mined-history-its-riveting-second-season-180969070/
How the Writers of “Timeless” Mined History for its Riveting Second Season
How the Writers of “Timeless” Mined History for its Riveting Second Season It’s not every TV season that a show comes along that fits so snugly into Smithsonian.com’s wheelhouse. That’s why we were so excited when “Timeless” got picked up for a second season, giving us self-professed history nerds a chance to geek out on the show and learn a few things in the process. The show took us to pivotal moments in American history, showing the fight for women’s suffrage, the birth of Delta Blues, the romance of classic Hollywood. And while we were “edu-tained,” we were also entertained, falling for the budding romance between Lucy and Wyatt (alas), cheering for Connor Mason’s redemption, and watching Rufus and Jiya grow closer together. As viewers now know (spoilers ahead, naturally), though, not all is well with the Time Team: Rittenhouse is still a going concern, now run by even more ruthless villains, and worst of all, Rufus is dead in 1888. Luckily, the team has an even-more-upgraded time machine—and a buffer, Tomb-Raidery #lyatt—so not all hope is lost, but we’ll have to hang on for a possible Season Three (NBC has not yet announced whether the show will be renewed) to see how that turns out. For now, though, this ends our foray into TV recapping. But as one last hurrah before we go, we convinced show co-creator Shawn Ryan (“The Shield,” “SWAT,” “Terriers”) to sit down with us and, for a short while, get just as nerdy about history as we are. So...Rufus. I don’t want to say NBC is holding Rufus hostage, but there you have it. How do you come up with the scenarios for your episodes? Do you start with a time period, or a character, or a story? Sometimes there’s a time period or a person that’s of such interest to us, we say, ‘We have to find a way to do an episode. [Co-creator] Eric [Kripke] has always wanted to do an episode about Robert Johnson. That was one that took a while for us to figure out, what’s the story around it? We centered it around Connor Mason and his first trip back into the past. Other times there’s a certain genre of show we want to do, so in Season One, we knew we wanted to do a spy story, behind enemy lines in Germany during World War II. We didn’t know if there was anyone historically significant, so we sent David [Hoffman, the show’s history consultant and one of the writers] off and asked, ‘Who plausibly would have been there?’ He came back with, ‘Did you know Ian Fleming was a spy?’ Other times we actually start with the emotional stories of the episode and use that as guidance for what historical period we might want to visit. So when Lucy and Wyatt are sort of falling for each other in episode three, before Jessica shows up, it’s terribly romantic and we wanted the height of romanticism. What’s more romantic than classic Hollywood? So sometimes the time period and the historical people come last. Sometimes they come first. Talk about a time where you said, I know this is historically inaccurate, but we’re putting it in because it makes for better TV. We try not to do that. One example that I can think of is in Season One. I think we were told that Katherine Johnson was not at NASA on the day of the moon landing. We certainly could not find definitive proof that she was there in the building on the day, and so we were faced with, ‘Well, do we abandon the story, or do we tell a sort of more general truth, the importance of who she was?’ But we usually try to avoid that, we try to be as true as we can. 
It seems like you’re putting a lot of effort into telling the stories of women and people of color this season. Was that intentional? Was it more difficult? One of the things we became interested in for Season Two is the historical figures you should know more about, rather than Jesse James or other people you really did know about. We were more interested in the Alice Pauls of the world. Obviously, there’s always more source material on somebody like JFK than there’s going to be on Alice Paul. Abby Franklin, when you go back to the 1600s, it’s kind of tough. But in some ways it gives you a little more freedom as writers. We’re always looking for a historical fact that contradicts what we want to do, and if there’s not, it gives us a little more wiggle room. So much of history as it’s taught revolves around powerful white men, and one of the things that was of great interest to us this year was to see if there was a way to explore history beyond that. What are the biggest logistical challenges in putting together a show that spans so many time periods? It’s brutal. I don’t know how else to put it. Eric and I look at ourselves and say what did we do here? It’s almost impossible to make a historical drama every week. It’s difficult to make a sci-fi show. And we’ve chosen to do both. I don’t know how Mari-An Ceo, our costume designer, I don’t know how she does it. A lot of times she has to make the costumes, because they’re not available to rent. Locations are also a big thing. We had a couple episodes that got out of control in Season One—we built the Alamo. We were under tighter financial constraints in Season Two. We really did get good at having one or two big set pieces that show off the time period in ways that sell the world, and then finding ways in other parts of the story to be in rooms that were more easily cheatable. What are your favorite time-travel movies or TV shows? Back to the Future was always a big one for me. That came out when I was a teenager, and that had a lot of impact. A different kind of time-travel story, The Terminator, was another big one. I always liked “Quantum Leap” when it was on. Eric is the huge sci-fi fan in our partnership…he talks about “Time Tunnel” as a show that impacted him. It’s like…late ’60s, and there’s a time tunnel. It was a little ahead of its time. Do you have a unified theory of time travel for the show? We have our rules. What I’ve learned is that the fans care so much about them, and you’ve got to be super careful. Our rules are there’s this tree trunk of time, and then with the time machine, if someone goes back to an earlier moment the tree trunk can grow in a different direction. So it’s not really a multiverse so much as there’s one thread, and the thread can be changed. So those people who go back in the past and something changes, when they get back those people remember what it used to be. We do have different people on the show who have different memories of different histories. Really, we’re a historical drama show. We’re not trying to dig deep into quantum physics. What stories do you still want to tell that you haven't yet had a chance to? There’s a bunch, but until we match them up with characters we don’t know if we can do them. One thing I learned about my own genealogy is I am a descendant of two different Orphan Train kids, so I’m really interested in doing a story about that. Teddy Roosevelt’s a really interesting figure… Not even necessarily during his presidential years.
I’ve always been interested in doing something around the labor movement…I think there’s interesting labor history we haven’t touched yet. If you had access to the Lifeboat, when and where would you take it? I’ve been asked this question a few times, and I always give a different answer. Sometimes I just want to see fantastic sporting events. Sometimes I think I would actually go back and try to change things, but then I would worry about the unintended consequences. This time I’ll say: I would take it and go to the Globe Theater and watch some original Shakespeare.
52420632f0aebba4a84d73b906981fca
https://www.smithsonianmag.com/history/howard-hughes-h-1-carried-him-all-the-way-89476521/
Howard Hughes’ H-1 Carried Him “All the Way”
Howard Hughes’ H-1 Carried Him “All the Way” The object at hand is silver and imperially slim, a fast and famous airplane. And not merely fast and famous either, but probably the most beautiful airplane ever built. Its wings fair into the fuselage with such a smooth and gracious curve that you can almost feel the air just sliding by with no friction. It is the Hughes 1-B racer, better known as the H-1, which is on view these days in the Smithsonian's National Air and Space Museum. In 1935, it set the world record for landplanes—at the then astonishing speed of 352.388 miles per hour. Sixteen months later, it flew nonstop from Burbank, California, to New Jersey's Newark Airport in 7 hours 28 minutes. As sleek and gleaming as Brancusi's famous Bird in Space, the H-1 may represent a pure marriage of form and function. But like many valuable and worldly objects, it was a product of money and ambition. The man who both flew it to fame and was responsible for its creation was Howard Hughes. In those innocent, far-off times Hughes was what was known as a "young sportsman." Born in 1905, he had, at 30, already taken over his father's tool company, made millions of dollars, sashayed around with a veritable Milky Way of movie starlets, and produced and directed Hell's Angels, the classic film of aerial death and dogfightery in World War I. Hughes was a man with a lifelong penchant for films, fast planes and beautiful women. Few begrudged him these preoccupations, even when his production of The Outlaw showed a good deal more of Jane Russell's facade than was then thought proper. But his private phobias about germs and secrecy were something else again. To recent generations he is mainly known as the pitiful, paranoid billionaire he became, a terminally ill, grotesque recluse who tried to control vast holdings from beleaguered rooftop quarters in places like Las Vegas and Jamaica. He had a world-class gift for taking umbrage—and for giving it. But in the air-minded 1930s, Hughes, who was Hollywood-handsome, rich as Croesus and a gifted dabbler in aeronautical engineering, was deservedly some kind of hero. He was brave, even foolhardy. His H-1 not only smashed records but broke new ground in aircraft design. He went on to pilot a standard, twin-ruddered and twin-engined Lockheed 14 around the world in a little more than 91 hours. It was not only a world record but a pioneer flight that paved the way for the infant commercial airline services, one of which, TWA, he later owned and ran. From the moment Hughes decided to make Hell's Angels he became a passionate flier. During the actual filming, when his hired stunt pilots refused to try a chancy maneuver for the cameras, Hughes did it himself, crash-landing in the process. He celebrated his 31st birthday by practicing touch-and-go landings in a Douglas DC-2. He also kept acquiring all sorts of aircraft to practice with and every one he got he wanted to redesign in some way. "Howard," a friend finally told him, "you'll never be satisfied until you build your own." The H-1 racer was the result. In the early '30s Hughes had hired an ace aeronautical engineer named Richard Palmer and a skilled mechanic and production chief, Glenn Odekirk. In 1934 they set to work in a shed in Glendale, California. Hughes' aim was not only "to build the fastest plane in the world" but to produce something that might recommend itself to the Army Air Corps as a fast pursuit plane. It was the right moment. 
The threat of World War II loomed in Spain and China; every year at the Thompson Trophy races in Cleveland, the country cheered the record-breaking exploits of hot little planes flown by the likes of Jimmy Doolittle and Roscoe Turner. Speed records had increased at a rate of about 15 mph a year since 1906, when Brazilian pilot Alberto Santos-Dumont set the first record, in France, at 25.66 mph. A few planes were of bizarre design, like the Gee Bee Sportster, which resembled a fireplug with cupid wings. Some had outsize radial engines (with cylinders set like spokes on a wheel). Others were pointy-nosed, like France's black Caudron racer with its sleek in-line engine. A Caudron set the 1934 speed record at 314.319 mph. In-line engines were more streamlined; radial engines ran cooler and gave less mechanical trouble. Hughes chose a Twin Wasp Junior by Pratt & Whitney, which could produce 900 hp if properly fed on 100-octane gas. It was a radial but small (only 43 inches in diameter), housed in a long, bell-shaped cowling to cut down drag. In building the H-1, cutting down drag became a cause célèbre. Its plywood-covered wings were short (with a span of only 24 feet 5 inches) and had been sanded and doped until they looked like glass. The thousands of rivets used on the surface of its aluminum monocoque fuselage were all countersunk, their heads partly sheared off and then burnished and polished to make a perfectly smooth skin. Every screw used on the plane's surface was tightened so that the slot was exactly in line with the airstream. The racer's landing gear, the first ever to be raised and lowered by hydraulic pressure rather than cranked by hand, folded up into slots in the wings so exactly that even the outlines could scarcely be seen. Sometimes, Hughes would be intimately involved with the work. Sometimes, he'd be off, buying or renting new planes to practice with, acquiring a huge yacht (which he practically never used), dating movie stars like Katharine Hepburn and Ginger Rogers. By August 10, 1935, the H-1 was finished. On the 17th, Hughes flew the dream plane for 15 minutes and landed. "She flies fine," he growled to Odekirk. "Prop's not working though. Fix it." He scheduled the official speed trial at Santa Ana down in Orange County for Thursday the 12th of September. Speed trials, under the aegis of the International Aeronautical Federation (FAI) in Paris, measured the best of four electrically timed passes over a three-kilometer course at no more than 200 feet above sea level. The contestant was allowed to dive into each pass, but from no higher than 1,000 feet. And for a record to be set, the plane had to land afterward with no serious damage. Darkness fell on the 12th before an official trial could be recorded. On Friday the 13th, no less a figure than Amelia Earhart turned up, officially flying cover at 1,000 feet to be sure Hughes stayed within the rules. Watched by a flock of experts on the ground, the H-1 took off, flew back over beet and bean and strawberry fields, dove to 200 feet and made its runs. To reduce weight the plane carried enough gas for five or six runs, but instead of landing, Hughes tried for a seventh. Starved for fuel, the engine cut out. The crowd watched in stunned silence under a suddenly silent sky. With stubby wings and high wing-loading (the ratio between a plane's lifting surfaces and its weight), the H-1 was not highly maneuverable even with power.
Characteristically cool, Hughes coaxed the plane into position over a beet field and eased in for a skillful, wheels-up belly landing. Though the prop blades got folded back over the cowling like the ends of a necktie in a howling wind, the fuselage was only slightly scraped. The record stood. At 352.388 mph the H-1 had left the Caudron's record in the dust. "It's beautiful," Hughes told Palmer. "I don't see why we can't use it all the way." "All the way" meant nonstop across America. The H-1 had cost Hughes $105,000 so far. Now it would cost $40,000 more. Palmer and Odekirk set to work, designing a longer set of wings for more lift. They installed navigational equipment, oxygen for high-altitude flying, new fuel tanks in the wings to increase capacity to 280 gallons. Hughes practiced cross-country navigation and bad-weather flying, buying a succession of planes and renting a Northrop Gamma from the famous air racer Jacqueline Cochran. By late December 1936, the H-1 was ready again. Hughes tried it out for a few hours at a time, checking his fuel consumption after each flight. On January 18, 1937, after only 1 hour 25 minutes in the air, he landed, and he and Odekirk stood beside the ship, making calculations. Their figures tallied. "At that rate," said Hughes, "I can make New York. Check her over and make the arrangements. I'm leaving tonight." Odekirk objected. So did Palmer, by phone from New York. The plane had no night-flight instruments. But there was nothing to be done. "You know Howard," Odekirk shrugged. That night Hughes did not bother with sleep. Instead he took a date to dinner, dropped her off at home after midnight, caught a cab to the airport, checked the weather reports over the Great Plains, climbed into a flight suit and took off. The hour was 2:14 a.m., a time when he was accustomed to doing some of his best "thinking." He rocketed eastward at 15,000 feet and above, using oxygen, riding the airstream at speeds faster than the sprints done that year by the Thompson Trophy racers at Cleveland. The tiny silver pencil of a plane touched down at Newark at 12:42 p.m., just in time for lunch. It had taken 7 hours 28 minutes 25 seconds, at an average speed of 327.1 mph. That record stood until 1946, to be broken by stunt pilot Paul Mantz in a souped-up World War II P-51 Mustang. Hughes went on to live an extraordinary and ultimately tragic life, one that made a different sort of headline. He founded a great electronics company and gave millions to medical research. During World War II he designed the Spruce Goose, a huge plywood flying boat that was derided in part because when it was ready, the country no longer needed it. And he died wretched. After landing in Newark, the H-1 simply sat for nearly a year and was finally flown back to California by someone else. Hughes eventually sold it, then bought it back. But he never flew the H-1 again. He was proud of it, though. He noted several times that its success had encouraged the development of the great radial-engine fighters of World War II: America's P-47 Thunderbolt and Grumman Hellcat, Germany's Focke-Wulf FW 190 and Japan's Mitsubishi Zero. When, in 1975, shortly before his death, he gave the H-1 to the Smithsonian, the plane had been flown for only 40.5 hours, less than half of that by Howard Hughes.
1a559c3ffb0093d281377a7b401e8d66
https://www.smithsonianmag.com/history/hypatia-ancient-alexandrias-great-female-scholar-10942888/
Hypatia, Ancient Alexandria’s Great Female Scholar
Hypatia, Ancient Alexandria’s Great Female Scholar One day on the streets of Alexandria, Egypt, in the year 415 or 416, a mob of Christian zealots led by Peter the Lector accosted a woman’s carriage and dragged her from it and into a church, where they stripped her and beat her to death with roofing tiles. They then tore her body apart and burned it. Who was this woman and what was her crime? Hypatia was one of the last great thinkers of ancient Alexandria and one of the first women to study and teach mathematics, astronomy and philosophy. Though she is remembered more for her violent death, her dramatic life is a fascinating lens through which we may view the plight of science in an era of religious and sectarian conflict. Founded by Alexander the Great in 331 B.C., the city of Alexandria quickly grew into a center of culture and learning for the ancient world. At its heart was the museum, a type of university, whose collection of more than a half-million scrolls was housed in the library of Alexandria. Alexandria underwent a slow decline beginning in 48 B.C., when Julius Caesar conquered the city for Rome and accidentally burned down the library. (It was then rebuilt.) By 364, when the Roman Empire split and Alexandria became part of the eastern half, the city was beset by fighting among Christians, Jews and pagans. Further civil wars destroyed much of the library’s contents. The last remnants likely disappeared, along with the museum, in 391, when the archbishop Theophilus acted on orders from the Roman emperor to destroy all pagan temples. Theophilus tore down the temple of Serapis, which may have housed the last scrolls, and built a church on the site. The last known member of the museum was the mathematician and astronomer Theon—Hypatia’s father. Some of Theon’s writing has survived. His commentary (a copy of a classical work that incorporates explanatory notes) on Euclid’s Elements was the only known version of that cardinal work on geometry until the 19th century. But little is known about his and Hypatia’s family life. Even Hypatia’s date of birth is contested—scholars long held that she was born in 370 but modern historians believe 350 to be more likely. The identity of her mother is a complete mystery, and Hypatia may have had a brother, Epiphanius, though he may have been only Theon’s favorite pupil. Theon taught mathematics and astronomy to his daughter, and she collaborated on some of his commentaries. It is thought that Book III of Theon’s version of Ptolemy’s Almagest—the treatise that established the Earth-centric model for the universe that wouldn’t be overturned until the time of Copernicus and Galileo—was actually the work of Hypatia. She was a mathematician and astronomer in her own right, writing commentaries of her own and teaching a succession of students from her home. Letters from one of these students, Synesius, indicate that these lessons included how to design an astrolabe, a kind of portable astronomical calculator that would be used until the 19th century. Beyond her father’s areas of expertise, Hypatia established herself as a philosopher in what is now known as the Neoplatonic school, a belief system in which everything emanates from the One. (Her student Synesius would become a bishop in the Christian church and incorporate Neoplatonic principles into the doctrine of the Trinity.) Her public lectures were popular and drew crowds. 
“Donning [the robe of a scholar], the lady made appearances around the center of the city, expounding in public to those willing to listen on Plato or Aristotle,” the philosopher Damascius wrote after her death. Hypatia never married and likely led a celibate life, which possibly was in keeping with Plato’s ideas on the abolition of the family system. The Suda lexicon, a 10th-century encyclopedia of the Mediterranean world, describes her as being “exceedingly beautiful and fair of form. . . in speech articulate and logical, in her actions prudent and public-spirited, and the rest of the city gave her suitable welcome and accorded her special respect.” Her admirers included Alexandria’s governor, Orestes. Her association with him would eventually lead to her death. Theophilus, the archbishop who destroyed the last of Alexandria’s great Library, was succeeded in 412 by his nephew, Cyril, who continued his uncle’s tradition of hostilities toward other faiths. (One of his first actions was to close and plunder the churches belonging to the Novatian Christian sect.) With Cyril the head of the main religious body of the city and Orestes in charge of the civil government, a fight began over who controlled Alexandria. Orestes was a Christian, but he did not want to cede power to the church. The struggle for power reached its peak following a massacre of Christians by Jewish extremists, when Cyril led a crowd that expelled all Jews from the city and looted their homes and temples. Orestes protested to the Roman government in Constantinople. When Orestes refused Cyril’s attempts at reconciliation, Cyril’s monks tried unsuccessfully to assassinate him. Hypatia, however, was an easier target. She was a pagan who publicly spoke about a non-Christian philosophy, Neoplatonism, and she was less likely to be protected by guards than the now-prepared Orestes. A rumor spread that she was preventing Orestes and Cyril from settling their differences. From there, Peter the Lector and his mob took action and Hypatia met her tragic end. Cyril’s role in Hypatia’s death has never been clear. “Those whose affiliations lead them to venerate his memory exonerate him; anticlericals and their ilk delight in condemning the man,” Michael Deakin wrote in his 2007 book Hypatia of Alexandria. Meanwhile, Hypatia has become a symbol for feminists, a martyr to pagans and atheists and a character in fiction. Voltaire used her to condemn the church and religion. The English clergyman Charles Kingsley made her the subject of a mid-Victorian romance. And she is the heroine, played by Rachel Weisz, in the Spanish movie Agora, which will be released later this year in the United States. The film tells the fictional story of Hypatia as she struggles to save the library from Christian zealots. Neither paganism nor scholarship died in Alexandria with Hypatia, but they certainly took a blow. “Almost alone, virtually the last academic, she stood for intellectual values, for rigorous mathematics, ascetic Neoplatonism, the crucial role of the mind, and the voice of temperance and moderation in civic life,” Deakin wrote. She may have been a victim of religious fanaticism, but Hypatia remains an inspiration even in modern times. Sarah Zielinski is an award-winning science writer and editor. She is a contributing writer in science for Smithsonian.com and blogs at Wild Things, which appears on Science News.
1720485e33fded5b90641ac1103222fc
https://www.smithsonianmag.com/history/i-hope-it-not-too-late-180963640/
World War I: 100 Years Later
World War I: 100 Years Later U.S. General John J. Pershing, newly arrived in France, visited his counterpart, French general Philippe Pétain, with a sobering message on June 16, 1917. It had been two months since the U.S. entered World War I, but Pershing, newly appointed to command the American Expeditionary Force in France, had hardly any troops to deploy. The United States, Pershing told Pétain, wouldn’t have enough soldiers to make a difference in France until spring 1918. “I hope it is not too late,” the general replied. Tens of thousands of Parisians had thronged the streets to cheer Pershing on his June 13 arrival. Women climbed onto the cars in his motorcade, shouting, “Vive l’Amérique!” The French, after three years of war with Germany, were desperate for the United States to save them. Now Pétain told Pershing that the French army was near collapse. A million French soldiers had been killed in trench warfare. Robert-Georges Nivelle’s failed April offensive against the German line in northern France had caused 120,000 French casualties. After that, 750,000 soldiers mutinied, refusing to go to the front line. Pétain, who replaced Nivelle in May, had kept the army together by granting some of the soldiers’ demands for better food and living conditions and leave to see their families. But the French were in no condition to launch any more offensives. “We must wait for the Americans,” Pétain told Pershing. But the United States wasn’t ready to fight. It had declared war in April 1917 with only a small standing army. Pershing arrived in France just four weeks after the Selective Service Act authorized a draft of at least 500,000 men. Though President Woodrow Wilson intended to send troops to France, there was no consensus on how many. “The more serious the situation in France,” Pershing wrote in his 1931 memoir, My Experiences in the World War, “the more deplorable the loss of time by our inaction at home appeared.” It fell to Pershing to devise the American war strategy. The 56-year-old West Point graduate had fought the Apache and Sioux in the West, the Spanish in Cuba, Filipino nationalists in their insurrection against U.S. rule and Pancho Villa in Mexico. He was blunt, tough, and stubborn—“a large man with small, trim arms and legs, and an underslung jaw that would defy an aerial bomb,” a contemporary wrote. He hated dithering, spoke little and hardly ever smiled. Resisting French and British pressure to reinforce their armies with American soldiers, Pershing and his aides studied where to best deploy the American Expeditionary Force. Germany had seized nearly all of Belgium and the northeast edge of France, so the war’s Western front now stretched 468 miles, from the Swiss border to the North Sea. The British were deployed in France’s northern tip, where they could quickly escape home if they had to. The French were defending Paris by holding the front about 50 miles northeast of the capital. So Pershing chose Lorraine, in northeastern France, as “a chance for the decisive use of our army.” If the Americans could advance just 40 miles from there, they could reach Germany itself, cut off the main German supply line, and threaten the enemy’s coalfields and iron mines. On June 26, Pershing visited Pétain again, and tentatively agreed on where to begin the first American offensive. On June 28, the first 14,500 American troops arrived in France. “Their arrival left Pershing singularly unimpressed,” wrote Jim Lacey in his 2008 biography, Pershing.
“To his expert eye the soldiers were undisciplined and poorly trained. Many of their uniforms did not fit and most were fresh from recruiting stations, with little training other than basic drill.” But Parisians wanted to throw a gala celebration for the troops on America’s Independence Day. To boost French morale, Pershing reluctantly agreed. On July 4, he and the troops marched five miles through Paris’ streets to the tomb of the Marquis de Lafayette. There, Pershing aide Charles E. Stanton delivered a speech that ended with a sweeping salute. “Nous voilà, Lafayette!” Stanton declared—“Lafayette, we are here!” in English—a phrase often misattributed to Pershing himself. Ceremonies performed, Pershing got back to work. The British and French counted on 500,000 U.S. troops in 1918. But Pershing suspected a half-million soldiers wouldn’t be enough. His three weeks in France had deepened his understanding of the Allies’ plight and their inability to break the stalemate on the Western Front. America, he decided, needed to do more. On July 6, Pershing cabled Newton Baker, the Secretary of War. “Plans should contemplate sending over at least 1,000,000 men by next May,” the telegram read. Soon after, Pershing and his aides forwarded a battle plan to Washington. It called for a larger military effort than the United States had ever seen. “It is evident that a force of about 1,000,000 is the smallest unit which in modern war will be a complete, well-balanced, and independent fighting organization,” Pershing wrote. And plans for the future, he added, might require as many as 3 million men. Pershing’s demand sent shock waves through the War Department. Admiral William Sims, who commanded the U.S. fleet in European waters, thought Pershing was joking when he heard it. Tasker Bliss, the War Department’s acting chief of staff, expressed alarm, but had no alternate plan. “Baker seemed unruffled,” wrote Frank E. Vandiver in his 1977 Pershing biography, Black Jack. “Committed to winning peace at any kind of rates, Wilson followed Baker’s calm.” They accepted Pershing’s war plan. Almost 10 million young men had already registered for the draft, giving the Wilson administration the means to fulfill Pershing’s demand. On July 20, Baker, wearing a blindfold, pulled numbers out of a glass bowl, choosing 687,000 men in the nation’s first draft lottery since the Civil War. By the end of July, the outlines of the war effort’s true scale—1 to 2 million men—began to emerge in the press. But the news didn’t reverse public and congressional support for the war. The shock of the Zimmermann Telegram and the patriotic exhortations of the government’s Committee on Public Information had overcome many Americans’ past skepticism about sending troops to fight in Europe. By the end of 1918, the United States would draft 2.8 million men into the armed forces—just in time to help its allies win the war. Erick Trickey is a writer in Boston, covering politics, history, cities, arts, and science. He has written for POLITICO Magazine, Next City, the Boston Globe, Boston Magazine, and Cleveland Magazine
63969b966ab273994ea642e749696dc5
https://www.smithsonianmag.com/history/if-theres-a-man-among-ye-the-tale-of-pirate-queens-anne-bonny-and-mary-read-45576461/?no-ist
If There’s a Man Among Ye: The Tale of Pirate Queens Anne Bonny and Mary Read
If There’s a Man Among Ye: The Tale of Pirate Queens Anne Bonny and Mary Read Last week Mike Dash told a tale of high seas adventure that put me in mind of another, somewhat earlier one. Not that Anne Bonny and Mary Read had much in common with kindly old David O’Keefe—they were pirates, for one thing, as renowned for their ruthlessness as for their gender, and during their short careers challenged the sailors’ adage that a woman’s presence on shipboard invites bad luck. Indeed, were it not for Bonny and Read, John “Calico Jack” Rackam’s crew would’ve suffered indignity along with defeat during its final adventure in the Caribbean. But more on that in a moment… Much of what we know about the early lives of Bonny and Read comes from a 1724 account titled A General History of the Robberies and Murders of the Most Notorious Pyrates, by Captain Charles Johnson (which some historians argue is a nom de plume for Robinson Crusoe author Daniel Defoe). A General History places Bonny’s birth in Kinsale, County Cork, Ireland, circa 1698. Her father, an attorney named William Cormac, had an affair with the family maid, prompting his wife to leave him. The maid, Mary Brennan, gave birth to Anne, and over time William grew so fond of the child he arranged for her to live with him. To avoid scandal, he dressed her as a boy and introduced her as the child of a relative entrusted to his care. When Anne’s true gender and parentage were discovered, William, Mary and their child emigrated to what is now Charleston, South Carolina. Mary died in 1711, at which point the teenaged Anne began exhibiting a “fierce and courageous temper,” reportedly murdering a servant girl with a case knife and beating half to death a suitor who tried to rape her. William, a successful planter, disapproved of his daughter’s rebellious ways; the endless rumors about her carousing in local taverns and sleeping with fishermen and drunks damaged his business. He disowned her when, in 1718, she married a poor sailor by the name of James Bonny. Anne and her new husband set off for New Providence (now Nassau) in the Bahamas, where James is said to have embarked on a career as a snitch, turning in pirates to Governor Woodes Rogers and collecting the bounties on their heads. Woodes, a former privateer himself, composed a “most wanted” list of ten notorious outlaws, including Blackbeard, and vowed to bring them all to trial. Anne, meanwhile, spent most of her time drinking at local saloons and seducing pirates; in A General History, Johnson contends that she was “not altogether so reserved in point of Chastity,” and that James Bonny once “surprised her lying in a hammock with another man.” Anne grew especially enamored of one paramour, John “Calico Jack” Rackam, so-called due to his affinity for garish clothing, and left Bonny to join Rackam’s crew. One legend holds that she launched her pirating career with an ingenious ploy, creating a “corpse” by mangling the limbs of a dressmaker’s mannequin and smearing it with fake blood. When the crew of a passing French merchant ship spotted Anne wielding an ax over her creation, they surrendered their cargo without a fight. A surprising number of women ventured to sea, in many capacities: as servants, prostitutes, laundresses, cooks and—albeit less frequently—as sailors, naval officers, whaling merchants or pirates.
Anne herself was likely inspired by a 16th-century Irishwoman named Grace O’Malley, whose fierce visage (she claimed her face was scarred after an attack by an eagle) became infamous along the coast of the Emerald Isle. Still, female pirates remained an anomaly and perceived liability; Blackbeard, for one, banned women from his ship, and if his crew took one captive she was strangled and pitched over the side. Anne refused to be deterred by this sentiment. Upon joining Rackam’s crew, she was said to have silenced a disparaging shipmate by stabbing him in the heart. Most of the time Anne lived as a woman, acting the part of Rackam’s lover and helpmate, but during engagements with other ships she wore the attire of a man: loose tunic and wide, short trousers; a sword hitched by her side and a brace of pistols tucked in a sash; a small cap perched atop a thicket of dark hair. Between sporadic bouts of marauding and pillaging, pirate life was fairly prosaic; our modern associations with the profession draw more from popular entertainment—Peter Pan, The Pirates of Penzance, a swashbuckling Johnny Depp—than from historical reality. The notion of “walking the plank” is a myth, as are secret stashes of gold. “Nice idea, buried plunder,” says maritime historian David Cordingly. “Too bad it isn’t true.” Pirates ate more turtles than they drank rum, and many were staunch family men; Captain Kidd, for instance, remained devoted to his wife and children back in New York. Another historian, Barry R. Burg, contends that the majority of sexual dalliances occurred not with women but with male shipmates. Accounts vary as to how Anne met Mary Read. According to Johnson, Rackam’s ship conquered Mary’s somewhere in the West Indies, and Mary was among those taken prisoner. After the engagement, Anne, dressed in female attire, tried to seduce the handsome new recruit. Mary, perhaps fearing repercussions from Rackam, informed Anne she was actually a woman—and bared her breasts to prove it. Anne vowed to keep Mary’s secret and the women became friends, confidantes and, depending on the source, lovers. They had much in common; Mary was also an illegitimate child. Her mother’s first child (this one by her husband) was a boy, born shortly after her husband died at sea. The widow’s mother-in-law took pity on her and offered to support her grandson until he was grown, but he died as well. Mary’s mother quickly became pregnant again, gave birth to Mary, and, in order to keep receiving money from her husband’s family, dressed her daughter to resemble her dead son. But her grandmother soon caught on and terminated the arrangement. To make ends meet, Mary’s mother continued dressing her as a boy and occasionally rented her out as a servant. Mary excelled at living as a man. Around age 13, she served as a “powder monkey” on a British man-of-war during the War of the Grand Alliance, carrying bags of gunpowder from the ship’s hold to the gun crews. Next she joined the Army of Flanders, serving in both the infantry and cavalry. She fell in love with her bunkmate and divulged her secret to him. Initially, the soldier suggested that Mary become his mistress—or, as Johnson put it, “he thought of nothing but gratifying his Passions with very little Ceremony”—but Mary replied, with no apparent irony, that she was a reserved and proper lady.
After informing her entire regiment that she was a woman, she quit the army and married the soldier, who died shortly before the turn of the 18th century. Mary resumed her life as a man and sailed for the West Indies on a Dutch ship, which was soon captured by English pirates. The crew, believing Mary to be a fellow Englishman, encouraged her to join them. Calico Jack Rackam served as the quartermaster of her new crew, and he, along with his shipmates, never suspected Mary’s true gender. She was aggressive and ruthless, always ready for a raid, and swore, well, like a drunken sailor. She was “very profligate,” recalled one of her victims, “cursing and swearing much.” Loose clothing hid her breasts, and no one thought twice about her lack of facial hair; her mates, most of them in their teens or early twenties, were also smooth-faced. It’s also likely that Mary suffered from stress and poor diet while serving in the army, factors that could have interrupted or even halted her menstrual cycle. Initially, Rackam was jealous of Anne’s relationship with Mary, and one day burst into her cabin intending to slit her throat. Mary sat up and opened her blouse. Rackam agreed to keep Mary’s secret from the rest of the crew and continued to treat her as an equal. (He was also somewhat mollified when she took up with a male crewmate.) During battles Anne and Mary fought side by side, wearing billowing jackets and long trousers and handkerchiefs wrapped around their heads, wielding a machete and pistol in either hand. “They were very active on board,” another victim later testified, “and willing to do any Thing.” The summer and early fall of 1720 proved especially lucrative for Rackam’s crew. In September they took seven fishing boats and two sloops near Harbor Island. A few weeks later, Anne and Mary led a raid against a schooner, shooting at the crew as they climbed aboard, cursing as they gathered their plunder: tackle, fifty rolls of tobacco and nine bags of pimento. They held their captives for two days before releasing them. Near midnight on October 22, Anne and Mary were on deck when they noticed a mysterious sloop gliding up alongside them. They realized it was one of the governor’s vessels, and they shouted for their crewmates to stand with them. A few obliged, Rackam included, but several had passed out from the night’s drinking. The sloop’s captain, Jonathan Barnett, ordered the pirates to surrender, but Rackam began firing his swivel gun. Barnett ordered a counterattack, and the barrage of fire disabled Rackam’s ship and sent the few men on deck scrambling to cower in the hold. Outnumbered, Rackam signaled surrender and called for quarter. But Anne and Mary refused to surrender. They remained on deck and faced the governor’s men alone, firing their pistols and swinging their cutlasses. Mary, the legend goes, was so disgusted she stopped fighting long enough to peer over the entrance of the hold and yell, “If there’s a man among ye, ye’ll come up and fight like the man ye are to be!” When not a single comrade responded, she fired a shot down into the hold, killing one of them. Anne, Mary and the rest of Rackam’s crew were finally overpowered and taken prisoner. Calico Jack Rackam was scheduled to be executed by hanging on November 18, and his final request was to see Anne. She had but one thing to say to him: “If you had fought like a man, you need not have been hang’d like a dog.” Ten days later, she and Mary stood trial at the Admiralty Court in St.
Jago de la Vega, Jamaica, both of them pleading not guilty to all charges. The most convincing witness was one Dorothy Thomas, whose canoe had been robbed during one of the pirates’ sprees. She stated that Anne and Mary threatened to kill her for testifying against them, and that “the Reason of her knowing and believing them to be women then was by the largeness of their Breasts.” Anne and Mary were found guilty and sentenced to be hanged, but their executions were stayed—because, as lady luck would have it, they were both “quick with child.”

Sources

Books:
Captain Charles Johnson. A General History of the Robberies and Murders of the Most Notorious Pyrates. London: T. Warner, 1724.
Barry R. Burg. Sodomy and the Pirate Tradition: English Sea Rovers in the Seventeenth-Century Caribbean. New York: New York University Press, 1995.
David Cordingly. Seafaring Women: Adventures of Pirate Queens, Female Stowaways, and Sailors’ Wives. New York: Random House, 2007.
_________. Under the Black Flag: The Romance and the Reality of Life Among the Pirates. New York: Random House, 2006.
_________. Pirate Hunter of the Caribbean: The Adventurous Life of Captain Woodes Rogers. New York: Random House, 2011.
Margaret S. Creighton and Lisa Norling. Iron Men, Wooden Women: Gender and Seafaring in the Atlantic. Baltimore: Johns Hopkins University Press, 1996.
Tamara J. Eastman and Constance Bond. The Pirate Trial of Anne Bonny and Mary Read. Cambria Pines, CA: Fern Canyon Press, 2000.
Angus Konstam and Roger Kean. Pirates: Predators of the Seas. New York: Skyhorse Publishing, 2007.
Elizabeth Kerri Mahon. Scandalous Women: The Lives and Loves of History’s Most Notorious Women. New York: Penguin Group, 2011.
C.R. Pennell. Bandits at Sea: A Pirates Reader. New York: New York University Press, 2011.
Diana Maury Robin, Anne R. Larsen, Carole Levin. Encyclopedia of Women in the Renaissance: Italy, France, and England.

Articles:
“Scholars Plunder Myths About Pirates, And It’s Such A Drag.” Wall Street Journal, April 23, 1992.
“West Indian Sketches.” New Hampshire Gazette, April 10, 1838.
“How Blackbeard Met His Fate.” Washington Post, September 9, 1928.
“Seafaring Women.” Los Angeles Times, March 8, 1896.
“Capt. Kidd and Others.” New York Times, January 1, 1899.
“Female Pirates.” Boston Globe, August 9, 1903.

Karen Abbott is a contributing writer for history for Smithsonian.com and the author of the books Sin in the Second City and American Rose. Her forthcoming book, Liar, Temptress, Soldier, Spy, will be published by HarperCollins in September.
f6fb393057a34b0593cd838c67688794
https://www.smithsonianmag.com/history/igorrote-tribe-traveled-world-these-men-took-all-money-180953012/
The Igorrote Tribe Traveled the World for Show And Made These Two Men Rich
The Igorrote Tribe Traveled the World for Show And Made These Two Men Rich A group of tribespeople danced with jerky movements as a man, barefoot and wearing only a g-string, dragged a dog by a rope. The mutt snapped and snarled. Then with one deft stroke, the man slit the animal’s throat before chopping its lifeless body into pieces and throwing it into a pot. This was the Igorrote Village at Coney Island, and in 1905, it was the talk of America. The Igorrotes, or Bontoc Igorrotes to use their full tribal name, were from a remote region in the far north of the Philippines named Bontoc. Truman Hunt, an opportunistic former medical doctor turned showman, came up with the idea of transporting 50 Igorrotes to America and putting them on display in a mocked-up tribal village at Coney Island. At its heart, The Lost Tribe of Coney Island is a tale of what happens when two cultures collide in the pursuit of money, adventure, and the American Dream. It is a story that makes us question who is civilized and who is savage. Hunt was a Spanish-American War veteran and former lieutenant governor of Bontoc, where he had become a trusted friend of the Igorrotes. The United States took control of the Philippines from Spain as part of the terms of the 1898 Treaty of Paris ending the war between the two nations. The U.S. also received stewardship of Puerto Rico and Guam, while Spain relinquished its claim to Cuba. In the following years, however, Filipino nationalists, uninterested in becoming the subjects of yet another colonial power, fought a prolonged three-year war with the United States, leading to the deaths of 4,200 Americans and casualties on the Filipino side that numbered in the hundreds of thousands, including combatants and civilians. The assumption of American control over the overseas territory prompted deep soul-searching at home. Was it right for America to acquire an overseas empire? When, if ever, would the Filipinos be ready to take over the responsibility of governing themselves? Faced with growing public opposition at home, the U.S. launched a pacification process led by future president William Howard Taft that provided for Filipino self-governance and eventual independence. In early 1905, Truman Hunt traveled to Bontoc and made the Bontoc Igorrotes an audacious offer: if they agreed to leave their families and friends behind for a year and journey with him to the United States to put on a show of their native customs, he would pay them each $15 a month in wages. At Coney Island, the Igorrotes performed a distorted version of their tribal rituals. They sang and danced, they held sham weddings and dog feasts with mutts brought from the pound. They were visited by millions of ordinary Americans, along with anthropologists, linguists, famous singers and actors, and even Alice Roosevelt, President Theodore Roosevelt’s daughter. The tribesmen, women and children inspired poems, newspaper cartoons, advertising slogans and jigsaw puzzles, and were written about in the New York Times, the Washington Post and by the Associated Press. Before long, the Igorrotes had made Hunt a fortune. But he was spending money as quickly as the Igorrotes earned it. He had no desire to share his lucrative trade with anyone. But, hot on Hunt’s heels, another group of Igorrotes arrived in America. They were traveling with Richard Schneidewind, another Spanish-American War veteran and a former cigar salesman. The two men could not have been more different.
Hunt was a charming risk-taker, and came to regard the tribespeople as a commodity. Schneidewind, who had been married to a Filipino woman who died giving birth to their first son, treated “his” tribespeople like family. He invited them to his home to meet his son and to eat dinner. Schneidewind took his Igorrote exhibition group to the 1905 Lewis and Clark Centennial Exposition in Portland, Oregon, then on to Chutes Park in Los Angeles, where they were a huge hit. Hunt was furious. He split his tribespeople into several troupes to maximize his profits. Hunt’s groups toured the country, making dozens of stops, lasting anything from a few days to several weeks. The rivalry between Hunt and Schneidewind was intense. In May, 1906, Hunt and Schneidewind ended up at competing parks in Chicago. There the two showmen did everything they could to undermine each other’s exhibits. Hunt rubbished Schneidewind’s reputation to his newspaper friends. Schneidewind and his business partner, Edmund Felder, wrote to the head of the Bureau of Insular Affairs, the U.S. government agency located within the War Department and charged with administering the nation’s newly acquired territories. Their letter reported that the village operated by Hunt and his associates at Sans Souci Park in Chicago was in a terrible condition. The 18 men and women in Hunt’s group, they wrote, were crammed into three small A-frame tents in a muddy scrap of land beneath the roller coaster. Their description, though arguably motivated more by business rivalry than concern for their fellow human beings, was accurate. A member of the public -- possibly put up to it by Schneidewind and Felder -- wrote to the Bureau complaining that the Bontoc Igorrotes were living in squalor. There were further rumors that Hunt had stolen the tribe’s wages and that two men in the group had died on the road and that the showman had failed to have their bodies buried. Both Hunt and Schneidewind had brought their Igorrote groups into America with permission from the U.S. government, an entity with a clear incentive to portray the people of the Philippines as primitive. How could such a society govern itself if it was filled with citizens as “backwards” as the Igorrotes? If it was true that Hunt was mistreating the Igorrotes, the government could hardly afford to be engulfed in a major scandal that could turn public opinion even further against a permanent presence in the Philippines. Alarmed, the chief of the Bureau of Insular Affairs, Clarence Edwards, and his deputy, Frank McIntyre, called in one of their agents, Frederick Barker, and asked him to investigate the claims. When Hunt received a tip-off that the Bureau was sending a man to examine his Igorrote enterprise, he fled town. He went on the run, taking some of the tribespeople with him. A manhunt followed as Pinkerton detectives, the government agent, creditors and a woman who accused Hunt of bigamy pursued the showman across America and Canada. Hunt proved himself to be a slippery opponent. Finally, in October 1906, he was arrested on multiple charges of stealing from the Igorrotes and sentenced to 18 months in the workhouse after a sensational trial in Memphis. With his rival out of the way, Schneidewind emerged as the leading showman in the Igorrote exhibition trade. In the winter of 1906, Schneidewind returned to the Philippines to collect another Igorrote group and embarked on a second tour of America. A third U.S. tour followed in 1908.
In 1911, despite vociferous opposition from Bontoc tribal elders and officials of nearby towns, Schneidewind was permitted to take a group of 55 Igorrotes to Europe, where they exhibited in France, Scotland, England, the Netherlands and Belgium. Schneidewind and his associates were unfamiliar with the European entertainment business, and in 1913, after two years on the road, they ran into serious financial difficulties. What happened next was alarmingly reminiscent of Truman Hunt’s tour. According to American newspaper reports, in the winter of 1913 a group of starving Igorrotes was found wandering the streets of Ghent, Belgium. The group’s interpreters, Ellis Tongai and James Amok, wrote to President Woodrow Wilson begging for his assistance. In their letter, they complained that they had not been paid for many months and reported the deaths of nine members of their group, including five children. Schneidewind told the Igorrotes that if they stayed on and continued working for him until the 1915 San Francisco Exposition, they would earn a handsome wage, allowing them to return home rich. Despite the hardships they had endured, about half of the group wished to stay on in Europe, a sign perhaps that Schneidewind’s troubles owed more to incompetence than to cruelty or a lack of compassion for the Filipinos. But, fearing another scandal, the U.S. government was unwilling to give Schneidewind another chance and decided it must intervene. In December, 1913, the U.S. consul in Ghent escorted the tribespeople to Marseilles to catch a boat back to Manila. This disastrous venture did little to help the image of the Igorrote show trade. The Philippine Assembly took action and, in 1914, passed legislation that banned the exhibition of groups of Filipino tribespeople abroad. As a measure of the seriousness with which the Philippine lawmakers regarded the subject, the ban was included as an amendment to a new Anti-Slavery Act. Schneidewind, like Hunt before him, exited the Igorrote show trade. For a full decade, starting in 1905, the Igorrotes had been the greatest show in town, thrilling and scandalizing the American public, and filling the nation’s newspapers. But in the years since, they have disappeared from the public consciousness. One of the few extant public acknowledgements of the Igorrote show is in Ghent, where an initiative to commemorate the city’s World Exhibition of 1913 led to the naming of streets and tunnels after notable participants of this historical event, including Timicheg, one of the nine Igorrotes who died on Schneidewind’s European tour. The Philippine Ambassador to Belgium remarked at the time that it was "commendable that the City of Ghent has not only chosen to celebrate the achievements relating to the 1913 expo, but has been able to balance this by commemorating those who experienced difficulties to participate in this event". More than a century on, the time has come to tell the Igorrotes’ incredible story. For more information on The Lost Tribe of Coney Island, visit claireprentice.org. Claire Prentice is a journalist and writer whose book The Lost Tribe of Coney Island: Headhunters, Luna Park, and the Man Who Pulled Off the Spectacle of the Century explores the true story of the Igorrotes.
6268307b4000aeff771b26bf37098262
https://www.smithsonianmag.com/history/illustrious-history-misquoting-winston-churchill-180953634/
The Illustrious History of Misquoting Winston Churchill
The Illustrious History of Misquoting Winston Churchill “If I were married to you, I’d put poison in your coffee,” Lady Astor once famously remarked to Winston Churchill. “If I were married to you,” he replied, “I’d drink it.” This month marks 50 years since the death of one of history’s most quotable people. Churchill’s speeches, letters and published works contain an estimated 15 million words—“more than Shakespeare and Dickens combined,” London Mayor (and Churchill biographer) Boris Johnson tells Smithsonian. The downside to Churchill’s prolificacy is that it’s easy to put words in his mouth. Like Oscar Wilde and Mark Twain, Churchill attracts false attributions like a magnet. “People tend to make them up,” says Richard Langworth, the editor of four books of authenticated Churchill quotations, who estimates that at least 80 famous sayings attributed to the British Bulldog weren’t necessarily uttered by him. That infamous Lady Astor exchange, for instance, probably took place between her and Churchill’s friend F.E. Smith, a statesman, and even then Smith was perhaps quoting an old joke. Churchill’s “Courage is what it takes to stand up and speak; courage is also what it means to sit down and listen”—recently quoted by Washington Redskins quarterback Robert Griffin III—has no known connection to Churchill at all. But connoisseurs of Churchillian ripostes can rest easy that his legendary rejoinder to a female politician who called him drunk is a confirmed matter of historical record—even if he did adapt it from a line in the W.C. Fields movie It’s a Gift. “Tomorrow, I shall be sober,” Churchill replied, “and you will still be disgustingly ugly.” (In the film, the victim was merely “crazy.”) Quoting Churchill accurately is not only tricky—it can be costly. Because of a decades-old copyright arrangement with his literary agency, Churchill’s estate charges a fee to quote from almost everything he published, including speeches. The fees go to a trust controlled by institutions and heirs. The licensing requirement doesn’t cover Churchill’s off-the-cuff remarks or, says his estate’s agent Gordon Wise, brief quotations in journalism or criticism permitted as fair use. For his 2012 book Churchill Style, Barry Singer, who owns a Churchill-themed bookstore in Manhattan, says he paid 40 cents a word to quote from his subject. “I literally cut quotes to come in under a certain budget,” Singer says. Patient writers can take heart: British copyrights expire 70 years after the author’s death. Max Kutner is a New York City-based journalist who has written for Newsweek, Boston magazine and Vice.com. He was an editorial intern for Smithsonian in 2014.
c43c976d0da135146a01ae2d879280e3
https://www.smithsonianmag.com/history/incomplete-history-told-new-yorks-kgb-museum-180971458/
The Incomplete History Told by New York’s K.G.B. Museum
The Incomplete History Told by New York’s K.G.B. Museum After downing a second vial of “baby blue truth serum,” which mysteriously tasted like vodka, I admitted something to myself. I wasn’t enjoying the sardine-and-hard-boiled-egg appetizer at the opening night party for the new KGB Spy Museum in downtown Manhattan. Everything else on that chilly January night, however, was otlichno. As an accordionist played post-war Russian pop songs, the assembled mix of media and other guests toured the museum. The native Russian guides highlighted some of the 3,500 items on display, with a break for us to strap in and pose in the replica psychiatric hospital torture chair (drilling into teeth to the jawline, thankfully not included). Among the other stops in the tour were picking up the phone to receive messages from former enemies like Nikita Khrushchev and Yuri Andropov (or current frenemy Vladimir Putin), getting creeped out by the one-night-only live model in the straitjacket (normally, a mannequin), and examining a half-century’s worth of espionage devices that defined the Cold War. My personal favorite? The “Deadly Kiss,” a single-shot lipstick gun the museum claims was specifically designed for female spies to use against targets in the boudoir. Sex sells. And kills. However, on a return visit a few days later in the sober morning hours, the museum had a different feel. Donning a full-length leather commissar’s coat and military hat for the Instagram-ready photo at a K.G.B. officer’s desk was kitschy fun in the moment, but the genocidal history of the Soviet regime that undergirds the history of it all can easily get lost in the whole Spy vs. Spy, Get Smart, “Moose and Squirrel” vibe. The KGB Spy Museum opened last month and chronicles the evolution of the Soviet secret police from the 1917 founding of Vladimir Lenin’s Cheka on through Joseph Stalin’s NKVD, led by mass murderer Lavrentiy Beria. (Beria, referred to by Stalin as “our Himmler,” gets a bio and bust as an early tour “highlight.”) The bulk of the museum is dedicated to the Komitet Gosudarstvennoy Bezopasnosti (K.G.B.), in English the “Committee for State Security,” founded in 1954 and active until 1991 with the dissolution of the Soviet Union. The USSR used the K.G.B. to quell dissent, by whatever violent means necessary, and run general surveillance on its citizenry as part of its efforts to maintain Communist order. During the Cold War, the K.G.B. rivaled the C.I.A. around the globe, but primarily carried out its most brutal acts behind the Iron Curtain. A 1980 U.S. intelligence report asserted that at its peak, the K.G.B. employed some 480,000 people (along with millions of informers) and infiltrated every aspect of life in the Soviet Union—one dissident Orthodox priest said in the 1970s that “one hundred percent of the clergy were forced to cooperate with the K.G.B.” Although no official accounting of the total atrocities committed by the K.G.B. exists, estimates hold that multiple millions of Russians were sent to forced labor camps known as gulags, or to their deaths, both at home and abroad. The K.G.B. was instrumental in crushing the Hungarian Revolution of 1956 and the Prague Spring of 1968. When a collection of documents related to the K.G.B.’s work in Prague was released and examined by reporters and historians, it became abundantly clear that of all the weapons used by the agency, fear was the most pervasive.
"They considered the worst enemies those who could influence public opinion through media,” said Milan Barta, a senior researcher at Prague's Institute for Study of Totalitarian Regimes in a 2014 interview with the Washington Examiner. Unsuccessful plots by the K.G.B. included the kidnapping of novelist Milan Kundera and the silencing of other key public figures. The brains behind the KGB Spy Museum, are not professionally trained curators or historians, however, and instead are a Lithuanian father-daughter team, Julius Urbaitis and Agne Urbaityte. Urbaitis, 55, began collecting World War II items as a young man. His taste for authentic artifacts is obsessive—at one point he had the largest collection of gas masks in Europe. Their display is certainly extensive, but it is personal, not one curated by academics . “Our mission is to tell the exact historical information, no politics, to show what technologies were used then, and what are used now,” says Urbaityte, 29, who, along with her father, only came to New York from Lithuania three months ago and are anxiously awaiting work visas. “We have extremely rare items and there is no collection like this in the world.” Urbaitis is a writer, scholar, and lecturer, but first and foremost, a collector. Not everything on view in his museum has dates or labels about the provenance, putting visitors in the position to take a leap of faith along with the collectors. For example, the write-up of the lipstick gun says it was “most likely used in the bedroom…” In 2014, after some three decades of assembling his items, the duo opened the Atomic Bunker Museum, housed 20 feet underground, in Kaunas, Lithuania. In the last few years, Lithuanian tourism has been on the rise, and their museum became a must-see attraction. Inspired by the museum’s popularity, a group of anonymous American collectors asked Urbaitis to evaluate their artifacts, which ultimately led to an unnamed entrepreneur funding the for-profit KGB Spy Museum (and its presumably whopping monthly rent). “When Dad gets interested in something, he wants to know everything about it,” says Urbaityte. “Whatever it is—motorcycles, old cars, listening devices—he figures out how it works, becomes an expert, and moves on to the next topic. He understands how [every object] works in the museum.” As he gave an interview to Channel One Russia clad in a trench coat and blue-tinted aviator sunglasses, Urbaitis looked the part of the dashing Cold War spy, and his collection is certainly thorough. It’s laid out in a snaking format with various sections dedicated to bugs, lie detectors, cameras of all sizes, cassette recorders, dictaphones, night vision goggles, radios, and a corner section with concrete prison doors. A standout piece is the Great Seal, better known as “The Thing,” a wooden U.S. coat of arms given as a gift from Soviet schoolchildren to American ambassador W. Averell Harriman in 1943. It hung in his Moscow office until 1952, but hidden inside was a 800 megahertz radio signal that “acted like a mirror reflecting light” and required no power supply for eavesdropping. Urbaitis collected sillier items, too. Rubber bald head wigs and community theater clown makeup provide a good reminder that not all spy technology was sophisticated. Kids can get their espionage on as well, playing “Spot the Spy” on interactive tablets arranged amidst the cutting-edge suitcase phones of the 1960s. 
At $25 a pop—$43.99 for a two-hour guided stroll—the KGB Spy Museum offers a thoroughly capitalist look at the decidedly Communist spy tools, from the Bolshevik era through the F.S.B. of today. Among its most current objects is a hollowed out “tree with eyes” with a hard drive from 2015. Altogether, touring the museum provides an engaging journey through the development of Soviet spy technology, but the bust of Joseph Stalin, a ruthless dictator who killed 20 million of his own people, haunts the entrance to the museum and looms over the visitor experience as well. Yet in order to remain “apolitical,” Urbaitis and his daughter run the risk of ignoring the geopolitical realities past and present. The technological specs and encyclopedia-style write-ups of the items don’t put the K.G.B. reign of terror in a larger global context. In the New Yorker, Russian-American journalist Masha Gessen writes the museum resembles one you might find in Russia, “a place where the K.G.B. is not only glorified and romanticized but also simply normalized.” It’s understandable why Urbaityte refers to the museum as “historical” and “educational” as opposed to “political”—the very word politics causes some people to roll their eyes and move on to M&M’s World, but ignoring the 21st-century state of affairs sells short the importance and evolution of the collection itself. Showcasing a facsimile of the ricin-tipped umbrella used to assassinate dissident Georgi Markov in 1978 is worthwhile, but not mentioning the 2006 poisoning of former Russian spy Alexander Litvinenko, at the behest of the former K.G.B. agent who okayed the meddling in the 2016 U.S. Presidential election, is conspicuous. A bigger concern is the absence of the full picture of abject human suffering caused by the Soviet state police. The fine print of the exhibit labels shares some gory details of various torture apparatuses, but the museum includes no all-encompassing look at the K.G.B.’s atrocities and how they relate to the 21st century. Take Afghanistan, for instance. In The Sword and the Shield, British historian Christopher Andrew and former K.G.B. officer Vasili Mitrokhin (who defected to the U.K. in 1992 with 25,000 pages of documents) detail how the K.G.B. concealed the horrors of the Afghan War—15,000 Russian soldiers killed, a million Afghan deaths, and four million refugees—from the Soviet people. You won’t find mention of it, or how it gave rise to the Taliban, in the museum, even as new American museums have sought to tell the full ugly chapters of American history. That list includes a reconciliation with lynchings and racial terror at The National Memorial for Peace and Justice and the tucked-away corner dedicated to those who jumped to their deaths at the 9/11 Memorial and Museum. Gessen postulates that no American museum would ever present the head of Adolf Hitler out on the sidewalk, adding, “And yet, for the American public, an entertaining presentation of what was probably the most murderous secret-police organization in history seems both unproblematic and commercially promising.” Nor is there a mention of the hundreds of thousands of Lithuanians murdered or sent to the gulags during the Soviet occupation. Some 1.6 million Russian-Americans live in the New York metropolitan area, with some 600,000 in New York City alone. Considering the K.G.B. only disbanded in 1991, and that the current president of Russia, Vladimir Putin, was himself once a K.G.B.
agent, many of the museum’s neighbors likely lived through the state security nightmare and might want their pain acknowledged beyond video-monitoring birdhouses and ashtrays that listen to you smoke. The museum’s physical collection is astounding, and by the metric of showing off how espionage technology evolved, it succeeds. Visitors should know, however, there’s a lot more to K.G.B. history than meets the spy. Editor's note, February 9, 2019: An earlier version of this story included a photo of Lenin, rather than Stalin, at the entrance to the museum. We have updated it to include a new photo featuring the correct Soviet leader. Originally from Montana, Patrick Sauer is a freelance writer based in Brooklyn. His work appears in Vice Sports, Biographile, Smithsonian, and The Classical, among others. He is the author of The Complete Idiot’s Guide to the American Presidents and once wrote a one-act play about Zachary Taylor.
cc3418fe384cdd0928994698b653d2fa
https://www.smithsonianmag.com/history/inside-founding-fathers-debate-over-what-constituted-impeachable-offense-180965083/
Inside the Founding Fathers’ Debate Over What Constituted an Impeachable Offense
Inside the Founding Fathers’ Debate Over What Constituted an Impeachable Offense The Constitutional Convention in Philadelphia was winding down, the draft of the United States’ supreme law almost finished, and George Mason, the author of Virginia’s Declaration of Rights, was becoming alarmed. Over the course of the convention, the 61-year-old had come to fear the powerful new government his colleagues were creating. Mason thought the president could become a tyrant as oppressive as George III. So on September 8, 1787, he rose to ask his fellow delegates a question of historic importance. Why, Mason asked, were treason and bribery the only grounds in the draft Constitution for impeaching the president? Treason, he warned, wouldn’t include “attempts to subvert the Constitution.” After a sharp back-and-forth with fellow Virginian James Madison, Mason came up with another category of impeachable offenses: “other high crimes and misdemeanors.” Americans have debated the meaning of this decidedly open-ended phrase ever since. But its inclusion, as well as the guidance the Founders left regarding its interpretation, offers more protection against a dangerous executive power than many realize. Of all the Founders who debated impeachment, three Virginians—Mason, Madison and delegate Edmund Randolph—did the most to set down a vision of when Congress should remove a president from office. Though the men had very different positions on the Constitution, their debates in Philadelphia and at Virginia’s ratifying convention in Richmond produced crucial definitions of an impeachable offense. And their ultimate agreement—that a president should be impeached for abuses of power that subvert the Constitution, the integrity of government, or the rule of law—remains essential to the debates we’re having today, 230 years later. The three men took on leading roles at the Constitutional Convention almost as soon as it convened on May 25, 1787. In the first week, Randolph, the 33-year-old Virginia governor, introduced the Virginia Plan, written by Madison, which became the starting point for the new national government. Mason, one of Virginia’s richest planters and a major framer of his home state’s new constitution, was the first delegate to argue that the government needed a check on the executive’s power. “Some mode of displacing an unfit magistrate” was necessary, he argued on June 2, without “making the Executive the mere creature of the Legislature.” After a short debate, the convention agreed to the language proposed in the Virginia Plan: the executive would “be removable on impeachment and conviction of malpractice or neglect of duty” – a broad standard that the delegates would later rewrite. Mason, Madison, and Randolph all spoke up to defend impeachment on July 20, after Charles Pinckney of South Carolina and Gouverneur Morris of Pennsylvania moved to strike it. “[If the president] should be re-elected, that will be sufficient proof of his innocence,” Morris argued. “[Impeachment] will render the Executive dependent on those who are to impeach.” “Shall any man be above justice?” Mason asked. “Shall that man be above it who can commit the most extensive injustice?” A presidential candidate might bribe the electors to gain the presidency, Mason suggested. 
“Shall the man who has practiced corruption, and by that means procured his appointment in the first instance, be suffered to escape punishment by repeating his guilt?” Madison argued that the Constitution needed a provision “for defending the community against the incapacity, negligence, or perfidy of the Chief Magistrate.” Waiting to vote him out of office in a general election wasn’t good enough. “He might pervert his administration into a scheme of peculation”— embezzlement—“or oppression,” Madison warned. “He might betray his trust to foreign powers.” Randolph agreed on both these fronts. “The Executive will have great opportunities of abusing his power,” he warned, “particularly in time of war, when the military force, and in some respects the public money, will be in his hands.” The delegates voted, 8 states to 2, to make the executive removable by impeachment. The Virginia delegates borrowed their model for impeachment from the British Parliament. For 400 years, English lawmakers had used impeachment to exercise some control over the king’s ministers. Often, Parliament invoked it to check abuses of power, including improprieties and attempts to subvert the state. The House of Commons’ 1640 articles of impeachment against Thomas Wentworth, Earl of Strafford, alleged “that he... hath traiterously endeavored to subvert the Fundamental Laws and Government of the Realms... and in stead thereof, to introduce Arbitrary and Tyrannical Government against Law.” (The impeachment stalled, but Parliament condemned Strafford by a bill of attainder, and he was beheaded in 1641.) The U.S. Constitution lays out a process that imitated Britain’s: The House of Representatives impeaches, as the House of Commons did, while the Senate tries and removes the official, as the House of Lords did. But unlike in Britain, where impeachment was a matter of criminal law that could lead to a prison sentence, the Virginia Plan proposed that the impeachment process lead only to the president’s removal from office and disqualification from holding future office. After removal, the Constitution says, the president can still be indicted and put on trial in regular courts. Still, by September, the delegates hadn’t resolved impeachment’s toughest question: What exactly was an impeachable offense? On September 4, the Committee on Postponed Matters, named to resolve the convention’s thorniest disputes, had replaced the “malpractice or neglect of duty” standard for impeachment with a much narrower one: “treason and bribery.” Limiting impeachment to treason and bribery cases, Mason warned on September 8, “will not reach many great and dangerous offences.” To make his case, he pointed to an impeachment taking place in Great Britain at the time—that of Warren Hastings, the Governor-General of India. Hastings had been impeached in May 1787, the same month the U.S. constitutional convention opened. The House of Commons charged Hastings with a mix of criminal offenses and non-criminal offenses, including confiscating land and provoking a revolt in parts of India. Hastings’ trial by the House of Lords was pending while the American delegates were debating in Philadelphia. Mason argued to his fellow delegates that Hastings was accused of abuses of power, not treason, and that the Constitution needed to guard against a president who might commit misdeeds like those alleged against Hastings. (In the end, the House of Lords acquitted Hastings in 1795.)
Mason, fearful of an unchecked, out-of-control president, proposed adding “maladministration” as a third cause for impeaching the president. Such a charge was already grounds for impeachment in six states, including Virginia. But on this point, Madison objected. The scholarly Princeton graduate, a generation younger than Mason at age 36, saw a threat to the balance of powers he’d helped devise. “So vague a term will be equivalent to a tenure during pleasure of the Senate,” he argued. In other words, Madison feared the Senate would use the word “maladministration” as an excuse to remove the president whenever it wanted. So Mason offered a substitute: “other high crimes and misdemeanors against the State.” The English Parliament had included a similarly worded phrase in its articles of impeachment since 1450. This compromise satisfied Madison and most of the other Convention delegates. They approved Mason’s amendment without further debate, 8 states to 3, but added “against the United States,” to avoid ambiguity. Unfortunately for everyone who’s argued since about what an impeachable offense is, the convention’s Committee on Style and Revision, which was supposed to improve the draft Constitution’s language without changing its meaning, deleted the phrase “against the United States.” Without that phrase, which explained what constitutes “high crimes,” many Americans came to believe that “high crimes” literally meant only crimes identified in criminal law. Historians debate whether the Founders got the balance on impeachment just right or settled for a vague standard that’s often too weak to stop an imperial president. Consider the 1868 impeachment of President Andrew Johnson, who escaped removal from office by one vote in the Senate. John F. Kennedy, in his 1955 book Profiles In Courage, celebrated Senator Edmund Ross’ swing vote for Johnson’s acquittal. Kennedy, echoing Madison’s fears of a Senate overthrowing presidents for political reasons, declared that Ross “may well have preserved for ourselves and posterity Constitutional government in the United States.” But Johnson spent most of his presidency undermining Reconstruction laws that Congress passed, over his vetoes, to protect the rights and safety of black Southerners. “To a large degree, the failure of Reconstruction could be blamed alone on President Johnson’s abuse of his discretionary powers,” Michael Les Benedict wrote in his 1973 book, The Impeachment and Trial of Andrew Johnson. Yet the House rejected a broad attempt to impeach Johnson for abuse of power in 1867, because many congressmen felt a president had to commit a crime to be impeached. Instead, Johnson was impeached in 1868 for firing Secretary of War Edwin Stanton in violation of the Tenure of Office Act. That law was arguably unconstitutional – a factor that contributed to the Senate’s decision to acquit. The 1974 House Judiciary Committee put the British example favored by Mason to use during Nixon’s Watergate scandal. “High crimes and misdemeanors,” the committee’s staff report argued, originally referred to “damage to the state in such forms as misapplication of funds, abuse of official power, neglect of duty, encroachment on Parliament’s prerogatives, corruption, and betrayal of trust,” allegations that “were not necessarily limited to common law or statutory derelictions or crimes.” The committee approved three articles of impeachment against Nixon on these grounds, charging him with obstructing justice and subverting constitutional government. 
The full House never voted on impeachment, but the proposed articles helped force the president’s resignation two weeks later. When Madison, Mason, and Randolph reunited in Richmond in June 1788 for Virginia’s convention to ratify the Constitution, they continued their debate on the question of impeachable offenses. By then each man had taken a different position on the Constitution. Madison had emerged as its main architect and champion, and Mason as a leading opponent who declared “it would end either in monarchy, or a tyrannical aristocracy.” Randolph, meanwhile, had voted against the Constitution in Philadelphia in September 1787, but swung his vote to yes in 1788 after eight other states had ratified it. Their disagreement illuminates the discussion over presidential powers in the modern era. When Mason argued that “the great powers of Europe, as France and Great Britain,” might corrupt the president, Randolph replied that it would be an impeachable offense for the president to violate the Constitution’s emoluments clause by taking payments from a foreign power. Randolph was establishing that violations of the Constitution would constitute high crimes and misdemeanors – and so would betraying the U.S. to a foreign government. And in an argument with Madison, Mason warned that a president could use the pardon power to stop an inquiry into possible crimes in his own administration. “He may frequently pardon crimes which were advised by himself,” Mason argued. “If he has the power of granting pardons before indictment, or conviction, may he not stop inquiry and prevent detection?” Impeachment, Madison responded, could impose the necessary check to a president’s abuse of the pardon power. “If the President be connected, in any suspicious manner, with any person,” Madison stated, “and there be grounds to believe he will shelter him, the House of Representatives can impeach him.” Erick Trickey is a writer in Boston, covering politics, history, cities, arts, and science. He has written for POLITICO Magazine, Next City, the Boston Globe, Boston Magazine, and Cleveland Magazine
05ccd8b33044b6fbc6488a86c540e2cc
https://www.smithsonianmag.com/history/inside-intense-rivalry-between-eliot-ness-and-j-edgar-hoover-180952784/
Inside the Intense Rivalry Between Eliot Ness and J. Edgar Hoover
Inside the Intense Rivalry Between Eliot Ness and J. Edgar Hoover The massive warehouse took up a block on Chicago’s South Wabash Avenue. Shades and wire screens blocked the windows. Iron bars reinforced the double doors. The sign read “The Old Reliable Trucking Company,” but the building gave off the yeasty odor of brewing beer. It was an Al Capone operation. At dawn on April 11, 1931, a ten-ton truck with a steel bumper rammed through the double doors. Alarm bells clanged as Prohibition agents rushed inside and nabbed five brewery workers. Then they set about blowtorching the brewing equipment, upending vats, hacking barrels open. They sent a cascade of beer worth the modern equivalent of $1.5 million into the sewer. Eliot Ness had struck again. “It’s funny, I think, when you back up a truck to a brewery door and smash it in,” Ness told a reporter. No one had so brazenly challenged Capone before, but then, the Prohibition Bureau had few agents like Ness. In a force known for corruption and ineptitude, he was known for turning down bribes bigger than his annual salary. He was 28, a college graduate, with blue-gray eyes, slicked-back dark hair and a square-set jaw, and he had a way with the press. When he took to calling his men “the Untouchables,” because the abuse they took from Capone’s men reminded Ness of India’s lowest caste, reporters adopted the nickname as a metaphor for the squad’s refusal to take bribes. Soon newspapers across the country were celebrating Ness as Capone’s nemesis. But two years later, Ness’ flood of raids, arrests and indictments was running dry. Capone was in prison, the Untouchables had been disbanded and the last days of Prohibition were ticking away. Ness had been reassigned to Cincinnati, where he chased moonshiners across Appalachian foothills. Hoping for another chance at glory, he applied for a job with J. Edgar Hoover’s budding Division of Investigation—the future FBI. A former U.S. attorney in Chicago wrote to recommend Ness. Hoover expedited a background investigation. One of his agents crisscrossed the Windy City and collected testimonials to the applicant’s courage, intelligence and honesty. The current U.S. attorney told the agent Ness was “above reproach in every way.” Back in the Chicago Prohibition Bureau office for a weekend in November 1933, Ness spoke with a friend on the phone about his prospects. “Boss is using his influence,” he said. “Everything appears to be OK.” He said he would take nothing less than special agent in charge of the Chicago office. He said it loud enough for another Prohibition agent to overhear. Soon word reached the Division of Investigation’s current special agent in charge in Chicago. After seeing Ness’ references, Hoover wrote him on November 27 to note that Division men started at $2,465 a year—well below the $3,800 Ness had listed as his senior Prohibition agent’s pay. “Kindly advise this Division whether you would be willing to accept the regular entrance salary in the event it is possible to utilize your services,” Hoover asked. There is no record that Ness responded. Maybe he never got a chance. The next day, the special agent in charge in Chicago began dispatching a string of memos to headquarters in Washington, D.C.—41 pages of reports, observations and transcripts. The memos make up the core of a 100-page FBI file on Ness that was held confidentially for eight decades, until it was released to me under a Freedom of Information Act request. 
Amid a catalog of innuendo and character assassination, the file includes a troubling allegation that the lead Untouchable was anything but. Beyond that, it illuminates the vendetta Hoover pursued against Ness throughout their careers—even after Ness was in his grave. That vendetta was launched just a week after the director had inquired about Ness’ salary requirements. On December 4, 1933—the day before Prohibition ended—Hoover sat with the file at his desk. Across a memo reporting the overheard phone conversation, he scrawled, “I do not think we want this applicant.” Eliot Ness’ troubles began on a raid he did not make. On August 25, 1933, a Polish immigrant named Joe Kulak was cooking off a batch of moonshine in the basement of a house on Chicago’s South Side when three Prohibition agents raided his 200-gallon still. Kulak handed them two notes, one typewritten, one penciled. “This place is O.K.’d by the United States Senator’s Office,” read the typewritten note, which bore the name of an aide to Senator J. Hamilton Lewis of Illinois. The penciled note carried the same message but added Lewis’ Chicago office address and: “Or see E. Ness.” Until then, E. Ness would have seemed destined to join forces with Hoover. Born in 1902 on the South Side, he was raised by Norwegian-immigrant parents. Peter Ness, a baker, and his wife, Emma, instilled in their youngest son a strict sense of integrity. After earning a bachelor’s degree in business at the University of Chicago, he followed his brother-in-law into the Prohibition Bureau. Later he returned to the university to study under the pioneering criminologist August Vollmer, who argued that beat cops—typically poorly trained, beholden to political patrons and easily corrupted—should be replaced by men who were insulated from politics and educated as thoroughly in their profession as doctors and lawyers. The United States needed such lawmen as the corruption of Prohibition gave way to more desperate crimes—the bank robberies and kidnappings of the Great Depression. In the summer of 1933, U.S. Attorney General Homer Cummings declared a new war on crime and gave Hoover free rein to build the once-obscure Bureau of Investigation into a powerful new division (which would be renamed the FBI in 1935). Hoover hired agents who had college degrees and respectable family backgrounds. He also punished them for leaving lunch crumbs on their desks, or overlooking a typo in their memos, or arriving for work even a minute tardy. Still, as Congress passed laws expanding the list of federal crimes, his unit became the place any ambitious lawman wanted to work. Melvin Purvis was Hoover’s kind of agent. He was the son of a bank director and plantation owner in South Carolina; he left a small-town law firm to join the division in 1927. Aloof and aristocratic, with a reedy voice and a drawl, he was, like Hoover, a bit of a dandy, favoring straw hats and double-breasted suits decorated with pocket squares. Hoover made him the special agent in charge in Chicago before he was 30, and he became the director’s favorite SAC. In letters addressed to “Mel” or “Melvin,” Hoover teased him about the effect he supposedly had on women. Still, everyone knew Hoover could be mercurial, and in 1933 Purvis had reason to worry. He had run the Chicago office for less than a year. That September, he’d staked out a tavern two hours too late and blown a chance to catch the notorious bank robber Machine Gun Kelly. So when he got wind that Ness was angling for his job, he moved quickly. 
A lot of the information he sent to Hoover was puffed up, undocumented or tailored to appeal to the director’s prurient streak. Ness, he complained, had failed to take down Capone. (It was common knowledge then that Capone had been convicted of tax, not liquor, violations.) A disgruntled Untouchable had told him the squad held a drinking party. (If so, it was kept quiet; Prohibition Bureau personnel records mention no party-related infractions.) Ness’ family looked down on his wife, and he preferred their company to hers. (Purvis knew Hoover liked to scrutinize his agents’ fiancées or spouses and sometimes tried to break up relationships he found objectionable.) But the most incriminating part of the file came directly from one of Ness’ fellow Prohibition agents. His name was W.G. Malsie. Newly transferred to Chicago as the acting head of the Prohibition Bureau’s office there, he didn’t know Ness and wasn’t inclined to defer to his reputation. When Joe Kulak reported for questioning the day after his still was busted, Malsie wanted him to explain his protection notes. It turned out that they had been written by his friend Walter Nowicki, an elevator operator in the building where Senator Lewis kept an office. Nowicki accompanied Kulak to the interview. A transcript of the interrogation is among the documents released to me. Nowicki told Malsie he’d gotten to know an aide to Lewis on elevator rides and eventually paid him $25 to $30 to protect Kulak’s still. Twice, he said, he’d seen the aide talking with Ness. And once, in front of Ness, Nowicki asked the aide to put Kulak’s still “in a safe position.” The aide “patted Mr. Ness on the back and told him to give the boys a break,” Nowicki recalled. Then he wrote down the still’s address and gave it to Ness, who tucked it into his inside coat pocket. “What did Ness say?” Malsie asked. “He said that it would be OK,” Nowicki replied. Later, Nowicki said, he approached Ness in the building’s lobby and asked him again about Kulak’s still. “He said that if the police bothered Joe there will be no case on it,” Nowicki recalled. Erick Trickey is a writer in Boston, covering politics, history, cities, arts, and science. He has written for POLITICO Magazine, Next City, the Boston Globe, Boston Magazine, and Cleveland Magazine
568d8c75f34ceaab17e2c85512283d6f
https://www.smithsonianmag.com/history/interactive-map-visualizes-queer-geography-20th-century-america-180974306/?preview
This Interactive Map Visualizes the Queer Geography of 20th-Century America
This Interactive Map Visualizes the Queer Geography of 20th-Century America At first glance, Bob Damron’s Address Book reads like any other travel guide. Bars, restaurants, hotels and businesses are grouped by city and state, their names and addresses listed in alphabetical order. An introductory note reassures readers that the information contained within the volume is up-to-date, while classifications written in abbreviated parentheticals offer travelers additional details on specific establishments: An asterisk, for example, indicates a place is “very popular,” while the letter “D” specifies if a bar or club has space for dancing. Ostensibly universal, Damron’s handbook, first published in 1964 and still released annually, was actually directed toward a specific—and secretive—audience. As Eric Gonzaba, a historian at California State University, Fullerton, explains, Damron, a white, gay man from San Francisco, “started just writing down lists of locations that he would visit, … places [where] he either found other gay men or he felt accepted.” What began as a personal reference for the Californian and his friends soon morphed into a thriving enterprise akin to The Negro Motorist Green Book, which safely shepherded African American travelers across the country during the Jim Crow era, but for gay men and, to a lesser extent, lesbian women. Crucially, Damron’s Address Book never explicitly identified its target audience (at least until 1999, when the word “gay” was first printed on its cover), instead relying on euphemisms, innuendo and coded abbreviations to circulate information within the queer community. A new public history initiative spearheaded by Gonzaba and Amanda Regan, a historian at Southern Methodist University, is poised to bring Damron’s findings into the digital age, drawing on more than 30,000 listings compiled between 1965 and 1980 to visualize queer spaces’ evolution over time. Titled Mapping the Gay Guides, the project aims to “correct the cultural erasure of historical geography” by spotlighting local communities’ oft-unheralded queer history and, adds Gonzaba, exploring “how that community relates to other parts of the country.” The first phase of Mapping the Gay Guides launched in mid-February with a focus on the southern United States. Site visitors can browse some 7,000 entries, filtering by year; geographic location; type (among others, cruising areas, book stores, and bars or clubs); and “establishment feature,” a term coined by the researchers to describe the abbreviated designations used in Damron’s original text. Vignettes accompanying the interactive map provide historical context on the data, lending the portal what Regan calls a “layered perspective”; sections on methodology and ethics offer insights on the project’s technical side and the fraught decision-making involved in transforming a historical document into a data set. Interns and graduate students helped the researchers organize this vast trove of data, transcribing text from digitized images of the guide and rendering the entries machine-readable. The students also aided in tracking down and verifying various establishments’ locations. Mapping the Gay Guides isn’t the first digital history project dedicated to Damron’s Address Books or the numerous spin-off guides the publications spawned. But it differs from the majority of these resources in its scope—most portals focus on a specific city or region, not the entire country—and use of a single source rather than multiple. 
As Gonzaba explains, “This is one publisher and one guy’s view of what the gay world looked like.” In 1964, the year Damron first published his Address Book, gay sex was considered a crime in every state except Illinois, and the Stonewall Uprising, widely credited with sparking the contemporary gay rights movement, was still five years away. To ensure his work reached its intended audience, Damron plugged into existing networks within the underground gay community, adding his handbook to the array of erotica, pulp novels, physique magazines and other printed materials available to those in the know. Per Gonzaba, Damron also sent guides to establishments featured in the text so they could sell copies to patrons. “The minute you enter the gay world via one of these sites,” says Gonzaba, “ … you can possibly buy access to even more of the gay culture, [identifying] more spaces by buying this guide and being able to see other places that might be of interest to you in other cities.” According to Los Angeles magazine’s Kate Sosin, Damron visited 200 cities across 37 states in the first year of publication alone. Almost every year thereafter, he released at least one new edition of the guide, adding entries submitted by readers and revising existing listings based on his trips back to the places mentioned. In some cases, he removed businesses because police crackdowns had rendered them unsafe for queer visitors. Damron’s Address Books weren’t the only gay travel guides available during the latter half of the 20th century, but as Mapping the Gay Guides points out, “They were the original and remained the gold standard, especially for men, through the 1990s.” In 1985, Damron sold his company to Dan Delbex, a friend of current owner Gina Gatta, who published the 52nd edition of the guide last year. Six years later, he died from complications of HIV. Much about the man himself—including the nature of the job that made him traverse the country—remains enigmatic. But by identifying patterns in the body of work Damron left behind, the researchers hope to learn more about his individual character, including the implicit biases he had as a gay, white man from the progressive coastal city of San Francisco. According to Gonzaba, Damron often classified sites popular among gay African American men in the American South as not only “B” (“Blacks Frequent”), but “RT,” or “Raunchy Types”—shorthand for establishments deemed “less than reputable.” Moving forward, the team plans on determining whether Damron repeated this pairing in listings for other regions of the country or limited its use largely to the southern states, which he appears to have viewed from a coastal perspective as “wholly unsafe for queer people.” “Is this a trend only in the South,” asks Gonzaba, “or does Damron conflate black spaces with spaces of vice, spaces of unsafety, spaces of deviancy?” Mapping the Gay Guide’s main function is preserving and publicizing an overlooked, under-studied chapter in LGBTQ history. As outlined on the project’s homepage, few of the businesses detailed in the Address Books remain in existence today. 
Largely omitted from the historical record, the presence of bars, bathhouses and informal cruising locations is easily forgotten, rendering the “queer history of local communities [seemingly] invisible or nonexistent.” Damron’s guidebooks refute this misconception, testifying to the existence of what Gonzaba deems “thriving” gay communities in cities across the country long before Stonewall and other milestones in the gay rights movement. The texts, though clearly aimed at a male audience, also hint at the growth of lesbian communities: The number of sites labeled “G” (“Girls, but seldom exclusively”) jumps from 3 in 1965 to 98 in 1980. On a wider scale, says David Johnson, author of Buying Gay: How Physique Entrepreneurs Sparked a Movement, to Los Angeles magazine, the Address Books likely contributed to the growth of a collective sense of gay identity. “They helped knit the community together in a national way,” explains Johnson. “So it’s no longer just, you go to your local bar, but wherever you are, if you’re traveling to a big city from a small town, you can find the community.” By fall 2020, the Mapping the Gay Guides team hopes to publish listings from every state, Washington, D.C., Guam, Puerto Rico and the Virgin Islands. The researchers will also continually update the site’s “Vignettes” section. In terms of anticipated audience, the project aims to appeal to a broad base of readers. “We want this mapping project to be used by public historians, by tour guides, by local museum docents,” says Gonzaba. “… We’re hoping that by introducing these maps and these listings to places like Savannah, Georgia, or Beaumont, Texas, or somewhere in Montana, that you can add queer history to the places where people say queer history doesn’t exist.” Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
0cafccaabfb6abc840035b38d19efc65
https://www.smithsonianmag.com/history/interactive-maps-out-lives-former-presidents-180961861/
This Interactive Maps Out the Lives of Former Presidents
This Interactive Maps Out the Lives of Former Presidents After leaving the highest elected office in the nation, what's a President of the United States to do? What can top a position as leader of the free world? Thirty-five of the 43 presidents have gotten to experience life after holding office (President Obama will be the thirty-sixth). George Washington set the tradition of retiring from public life after two terms in the presidency. Some of the earlier presidents returned to their farms and homes, choosing to stay out of public life. Others used the time off to write their memoirs. And then there were the former presidents who sought to continue exerting their influence instead—whether by authoring a new state constitution like James Monroe, returning to Congress like John Quincy Adams and Andrew Johnson, or becoming Chief Justice of the United States like William Howard Taft. John Tyler, uniquely, served in the Confederate House of Representatives until his death, when he became the only former president buried under a foreign nation's flag. For some, retirement wasn't a blessing, the emptiness leaving them wanting more. Martin Van Buren ran on the 1848 Free Soil Party ticket. Millard Fillmore joined the Know Nothing Party for the 1856 presidential election. And in 1912, Theodore Roosevelt split the Republican vote by running on the Progressive Bull Moose Party ticket. All three were unsuccessful in their third-party runs, but another former president, Grover Cleveland, managed to make it back to the White House after a four-year hiatus. In the modern era, Americans have seen presidents live longer after their presidential terms and remain highly involved in public life, with a few exceptions. Between building their presidential libraries, contributing to various humanitarian efforts and contending with a 24-hour news cycle, staying out of the public eye proves difficult for all of the surviving former presidents. It remains to be seen how the soon-to-be-retired Barack Obama will choose to spend his retirement years. Eight commanders-in-chief, though, never had the chance to see what life after the presidency held. Four were assassinated in office and four died of natural causes, including William Henry Harrison, who was president for only 30 days. Richard Nixon resigned in infamy, enjoying not a retirement but a public exile of sorts. Here's how all of the presidents who survived their turn in office did after they left:
e3be8080a3f75a7a987b5bef64685858
https://www.smithsonianmag.com/history/intergalactic-battle-ancient-rome-180961416/
The Intergalactic Battle of Ancient Rome
The Intergalactic Battle of Ancient Rome A long time ago, in a world not so far away, a young man who longed for adventure was swept up in a galactic war. Forced to choose between two sides in the deadly battle, he befriended a group of scrappy fighters who captained… three-headed vultures, giant fleas and space spiders? Nearly 2,000 years before George Lucas created his epic space opera Star Wars, Lucian of Samosata (a city in modern-day Turkey) wrote the world’s first novel featuring space travel and interplanetary battles. True History was published around 175 CE during the height of the Roman Empire. Lucian’s space adventure features a group of travelers who leave Earth when their ship is thrown into the sky by a ferocious whirlwind. After seven days of sailing through the air they arrive on the Moon, only to learn its inhabitants are at war with the people of the Sun. Both parties are fighting for control of a colony on the Morning Star (the planet we today call Venus). The warriors for the Sun and Moon armies travel through space on winged acorns and giant gnats and horses as big as ships, armed with outlandish weapons like slingshots that used enormous turnips as ammunition. Thousands die during the battle, and blood “[falls] upon the clouds, which made them look of a red color; as sometimes they appear to us about sun-setting,” Lucian wrote. After the war’s conclusion, Lucian and his friends continue traveling through space, learning about the Moon’s odd inhabitants (an all-male society whose members have a single toe instead of a whole foot and whose children are cut from their calves) before moving on to visit the Morning Star and other space cities. Lucian was more of a satirist than a novelist; True History was written as a critique of philosophers and historians, and their ways of thinking about new discoveries. As scholar Roy Arthur Swanson writes, Lucian’s work provided “the perennially necessary reminder that thinking and believing are different and distinct kinds of mental activity and that it is best not to confuse them.” But being a work of satire doesn’t preclude True History from joining the ranks of science fiction. In addition to depicting first contact, wars in space and a flight to the moon, the work shares its satirical nature with the genre’s modern form. “One of the consistent themes of sci-fi is satire, and making fun of the way humans live and run the world,” says Aaron Parrett, professor of English at the University of Great Falls in Montana. “That is one reason why Lucian is so important. He did that very thing.” Lucian was also likely aware of major scientific and philosophical research of his time, including Plutarch’s “On the Face in the Orb of the Moon,” and Ptolemy’s last recorded observation of the planets, which occurred 14 years prior to Lucian’s publication. Still, the astronomical telescope wasn’t invented until 1610, and Lucian’s narrative doesn’t feature scientifically sound space travel. Does that mean it doesn’t count as an early form of the genre? It depends who you ask. Douglas Dunlop, who works as a metadata librarian at Smithsonian Libraries, sees parallels between Lucian’s writing and that of later science fiction writers like Jules Verne and H.G. Wells. “Just because it doesn’t have what we would call ‘modern science’ doesn’t take away from the fact that [philosophy and natural sciences] influenced the writing,” Dunlop says.
“There was a theory called Plurality of Worlds that goes back to Greek Antiquity, which was the concept of life existing in space. So who’s to say what they were doing in their philosophy and observation wasn’t informing their understanding of the world around them?” Other literary scholars have posited the world of science fiction starts with the Epic of Gilgamesh (2100 B.C.), Frankenstein (1818), or the works of Jules Verne (1850s). For famous American astronomer Carl Sagan, sci-fi starts with Johannes Kepler’s novel Somnium (1634), which describes a trip to the moon and the view of Earth seen from far away. But Kepler, as it turns out, was partially inspired by Lucian. He picked up True History in the original Greek to master the language. (While Latin was the vernacular of ancient Rome, Greek was the language used by the educated elite.) He wrote that his studies were improved by his enjoyment of the adventure, and it seems to have sent his imagination spinning as well. “These were my first traces of a trip to the moon, which was my aspiration at later times,” Kepler wrote. Genre requirements aside, both True History and Star Wars offer ways of understanding and exploring the human world, even though the stories take place in the stars. “One of the great things that science fiction does as a way of changing people’s worldview is show what the world might be like,” Parrett says. “It’s remarkable that people dreamed up things long before there was any possibility that they could do it. This is true not just of flying to the moon, but of flying in general.” Lucian may never have believed humans would achieve flight to the moon—but he imagined it. And the path he laid for intergalactic stories continues to send writers, scientists and movie-goers dreaming of what might be out there, just beyond our reach. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
b142bc1f434e9d5b62fd984e04bbd51d
https://www.smithsonianmag.com/history/inventing-alphabet-180976520/
Who Invented the Alphabet?
Who Invented the Alphabet? Centuries before Moses wandered in the “great and terrible wilderness” of the Sinai Peninsula, this triangle of desert wedged between Africa and Asia attracted speculators, drawn by rich mineral deposits hidden in the rocks. And it was on one of these expeditions, around 4,000 years ago, that some mysterious person or group took a bold step that, in retrospect, was truly revolutionary. Scratched on the wall of a mine is the very first attempt at something we use every day: the alphabet. The evidence, which continues to be examined and reinterpreted 116 years after its discovery, is on a windswept plateau in Egypt called Serabit el-Khadim, a remote spot even by Sinai standards. Yet it wasn’t too difficult for even ancient Egyptians to reach, as the presence of a temple right at the top shows. When I visited in 2019, I looked out over the desolate, beautiful landscape from the summit and realized I was seeing the same view the inventors of the alphabet had seen every day. The temple is built into the living rock, dedicated to Hathor, the goddess of turquoise (among many other things); stelae chiseled with hieroglyphs line the paths to the shrine, where archaeological evidence indicates there was once an extensive temple complex. A mile or so southwest of the temple is the source of all ancient interest in this area: embedded in the rock are nodules of turquoise, a stone that symbolized rebirth, a vital motif in Egyptian culture and the color that decorated the walls of their lavish tombs. Turquoise is why Egyptian elites sent expeditions from the mainland here, a project that began around 2,800 B.C. and lasted for over a thousand years. Expeditions made offerings to Hathor in hopes of a rich haul to take home. In 1905, a couple of Egyptologists, Sir William and Hilda Flinders Petrie, who were married, first excavated the temple, documenting thousands of votive offerings there. The pair also discovered curious signs on the side of a mine, and began to notice them elsewhere, on walls and small statues. Some signs were clearly related to hieroglyphs, yet they were simpler than the beautiful pictorial Egyptian script on the temple walls. The Petries recognized the signs as an alphabet, though decoding the letters would take another decade, and tracing the source of the invention far longer. The Flinders Petries brought many of the prizes they had unearthed back to London, including a small, red sandstone sphinx with the same handful of letters on its side as those seen in the mines. After ten years of studying the inscriptions, in 1916 the Egyptologist Sir Alan Gardiner published his transcription of the letters and their translation: An inscription on the little sphinx, written in a Semitic dialect, read “Beloved of Ba’alat,” referring to the Canaanite goddess, consort of Ba’al, the powerful Canaanite god. “For me, it’s worth all the gold in Egypt,” the Israeli Egyptologist Orly Goldwasser said of this little sphinx when we viewed it at the British Museum in late 2018. She had come to London to be interviewed for a BBC documentary about the history of writing. In the high-ceilinged Egypt and Sudan study room lined with bookcases, separated from the crowds in the public galleries by locked doors and iron staircases, a curator brought the sphinx out of its basket and placed it on a table, where Goldwasser and I marveled at it. 
“Every word we read and write started with him and his friends.” She explained how miners on Sinai would have gone about transforming a hieroglyph into a letter: “Call the picture by name, pick up only the first sound and discard the picture from your mind.” Thus, the hieroglyph for an ox, aleph, helped give a shape to the letter “a,” while the alphabet’s inventors derived “b” from the hieroglyph for “house,” bêt. These first two signs came to form the name of the system itself: alphabet. Some letters were borrowed from hieroglyphs, others drawn from life, until all the sounds of the language they spoke could be represented in written form. The temple complex detailed evidence of the people who worked on these Egyptian turquoise excavations in the Sinai. The stelae that line the paths record each expedition, including the names and jobs of every person working on the site. The bureaucratic nature of Egyptian society yields, today, a clear picture of the immigrant labor that flocked to Egypt seeking work four millennia ago. As Goldwasser puts it, Egypt was “the America of the old world.” We can read about this arrangement in Genesis, when Jacob, “who dwelt in the land of Canaan”—that is, along the Levant coast, east of Egypt—traveled to Egypt to seek his fortune. Along with shepherds like Jacob, other Canaanites ended up mining for the Egyptian elites in Serabit, some 210 miles southeast by land from Memphis, the seat of pharaonic power. Religious ritual played a central role in inspiring foreign workers to learn to write. After a day’s work was done, Canaanite workers would have observed their Egyptian counterparts’ rituals in the beautiful temple complex to Hathor, and they would have marveled at the thousands of hieroglyphs used to dedicate gifts to the goddess. In Goldwasser’s account, they were not daunted by being unable to read the hieroglyphs around them; instead, they began writing things their own way, inventing a simpler, more versatile system to offer their own religious invocations. The alphabet remained on the cultural periphery of the Mediterranean until six centuries or more after its invention, seen only in words scratched on objects found across the Middle East, such as daggers and pottery, not in any bureaucracy or literature. But then, around 1200 B.C., came huge political upheavals, known as the late Bronze Age collapse. The major empires of the near east—the Mycenaean Empire in Greece, the Hittite Empire in Turkey and the ancient Egyptian Empire—all disintegrated amid internal civil strife, invasions and droughts. With the emergence of smaller city-states, local leaders began to use local languages to govern. In the land of Canaan, these were Semitic dialects, written down using alphabets derived from the Sinai mines. These Canaanite city-states flourished, and a bustling sea trade spread their alphabet along with their wares. Variations of the alphabet—now known as Phoenician, from the Greek word for the Canaanite region—have been found from Turkey to Spain, and survive until today in the form of the letters used and passed on by the Greeks and the Romans. In the century since the discovery of those first scratched letters in the Sinai mines, the reigning academic consensus has been that highly educated people must have created the alphabet. But Goldwasser’s research is upending that notion. 
She suggests that it was actually a group of illiterate Canaanite miners who made the breakthrough, unversed in hieroglyphs and unable to speak Egyptian but inspired by the pictorial writing they saw around them. In this view, one of civilization’s most profound and most revolutionary intellectual creations came not from an educated elite but from illiterate laborers, who usually get written out of history. Pierre Tallet, former president of the French Society of Egyptology, supports Goldwasser’s theory: “Of course [the theory] makes sense, as it is clear that whoever wrote these inscriptions in the Sinai did not know hieroglyphs,” he told me. “And the words they are writing are in a Semitic language, so they must have been Canaanites, who we know were there from the Egyptians’ own written record here in the temple.” There are doubters, though. Christopher Rollston, a Hebrew scholar at George Washington University, argues that the mysterious writers likely knew hieroglyphs. “It would be improbable that illiterate miners were capable of, or responsible for, the invention of the alphabet,” he says. But this objection seems less persuasive than Goldwasser’s account—if Egyptian scribes invented the alphabet, why did it promptly disappear from their literature for roughly 600 years? Besides, as Goldwasser points out, the close connection between pictograms and text would seem to be evident all around us, even in our hyper-literate age, in the form of emojis. She uses emojis liberally in her emails and text messages, and has argued that they fulfill a social need the ancient Egyptians would have understood. “Emojis actually brought modern society something important: We feel the loss of images, we long for them, and with emojis we have brought a little bit of the ancient Egyptian games into our lives.” This article is a selection from the January/February issue of Smithsonian magazine Lydia Wilson is a research associate at the Computer Laboratory at the University of Cambridge and a visiting scholar at the Ralph Bunche Institute for International Studies at City University New York. She edits the Cambridge Literary Review, writes for both academic and popular publications and recently hosted the BBC series A Secret History of Writing.
5208e58c92426be1744ba6d1ad445598
https://www.smithsonianmag.com/history/islams-medieval-underworld-15821520/
Islam’s Medieval Underworld
Islam’s Medieval Underworld The year is—let us say—1170, and you are the leader of a city watch in medieval Persia. Patrolling the dangerous alleyways in the small hours of the morning, you and your men chance upon two or three shady-looking characters loitering outside the home of a wealthy merchant. Suspecting that you have stumbled across a gang of housebreakers, you order them searched. From various hidden pockets in the suspects’ robes, your men produce a candle, a crowbar, stale bread, an iron spike, a drill, a bag of sand—and a live tortoise. The reptile is, of course, the clincher. There are a hundred and one reasons why an honest man might be carrying a crowbar and a drill at three in the morning, but only a gang of experienced burglars would be abroad at such an hour equipped with a tortoise. It was a vital tool in the Persian criminals’ armory, used—after the iron spike had made a breach in a victim’s dried-mud wall—to explore the property’s interior. We know this improbable bit of information because burglars were members of a loose fraternity of rogues, vagabonds, wandering poets and outright criminals who made up Islam’s medieval underworld. This broad group was known collectively as the Banu Sasan, and for half a dozen centuries its members might be encountered anywhere from Umayyad Spain to the Chinese border. Possessing their own tactics, tricks and slang, the Banu Sasan comprised a hidden counterpoint to the surface glories of Islam’s golden age. They were also celebrated as the subjects of a scattering of little-known but fascinating manuscripts that chronicled their lives, morals and methods. According to Clifford Bosworth, a British historian who has made a special study of the Banu Sasan, this motley collection of burglars’ tools had some very precise uses: The thieves who work by tunneling into houses and by murderous assaults are much tougher eggs, quite ready to kill or be killed in the course of their criminal activities. They necessarily use quite complex equipment… are used for the work of breaking through walls, and the crowbar for forcing open doors; then, once a breach is made, the burglar pokes a stick with a cloth on the end into the hole, because if he pokes his own head through the gap, it might well be the target for the staff, club or sword of the houseowner lurking on the other side. The tortoise is employed thus. The burglar has with him a flint-stone and a candle about as big as a little finger. He lights the candle and sticks it on the tortoise’s back. The tortoise is then introduced through the breach into the house, and it crawls slowly around, thereby illuminating the house and its contents. The bag of sand is used by the burglar when he has made his breach in the wall. From this bag, he throws out handfuls of sand at intervals, and if no-one stirs within the house, he then enters it and steals from it; apparently the object of the sand is either to waken anyone within the house when it is thrown down, or else to make a tell-tale crushing noise should any of the occupants stir within it. Also, the burglar may have with him some crusts of dry bread and beans. If he wishes to conceal his presence, or hide any noise he is making, he gnaws and munches at these crusts and beans, so that the occupants of the house think that it is merely the cat devouring a rat or mouse. As this passage hints, there is much about the Banu Sasan that remains a matter of conjecture.
This is because our knowledge of the Islamic underworld comes from only a handful of surviving sources. The overwhelming mass of Arabic literature, as Bosworth points out, “is set in a classical mold, the product of authors writing in urban centers and at courts for their patrons.” Almost nothing written about daily life, or the mass of the people, survives from earlier than the ninth century (that is, the third century AH), and even after that date the information is very incomplete. It is not at all certain, for example, how the Banu Sasan came by their name. The surviving sources mention two incompatible traditions. The first is that Islamic criminals were considered to be followers—“sons”—of a (presumably legendary) Sheikh Sasan, a Persian prince who was displaced from his rightful place in the succession and took to living a wandering life. The second is that the name is a corrupted version of Sasanid, the name of the old ruling dynasty of Persia that the Arabs destroyed midway through the seventh century. Rule by alien conquerors, the theory goes, reduced many Persians to the level of outcasts and beggars, and forced them to live by their wits. There is no way now of knowing which of these tales, if either, is rooted in truth. What we can say is that the term “Banu Sasan” was once in widespread use. It crops up to describe criminals of every stripe, and also seems to have been acknowledged, and indeed used with pride, by the villains of this period. Who were they, then, these criminals of Islam’s golden age? The majority, Bosworth says, seem to have been tricksters of one sort or another, who used the Islamic religion as a cloak for their predatory ways, well aware that the purse-strings of the faithful could easily be loosed by the eloquence of the man who claims to be an ascetic or mystic, or a worker of miracles and wonders, to be selling relics of the Muslim martyrs and holy men, or to have undergone a spectacular conversion from the purblindness of Christianity or Judaism to the clear light of the faith of Muhammad. Amira Bennison identifies several adaptable rogues of this type, who could “tell Christian, Jewish or Muslim tales depending on their audience, often aided by an assistant in the audience who would ‘oh’ and ‘ah’ at the right moments and collect contributions in return for a share of the profits,” and who thought nothing of singing the praises of both Ali and Abu Bakr—men whose memories were sacred to the Shia and the Sunni sects, respectively. Some members of this group would eventually adopt more legitimate professions—representatives of the Banu Sasan were among the first and greatest promoters of printing in the Islamic world—but for most, their way of life was something they took pride in. One of the best-known examples of the maqamat (popular) literature that flourished from around 900 tells the tale of Abu Dulaf al-Khazraji, the self-proclaimed king of vagabonds, who secured a tenuous position among the entourage of a 10th-century vizier of Isfahan, Ibn Abbad, by telling sordid, titillating tales of the underworld. “I am of the company of beggar lords,” Abu Dulaf boasts in one account, the cofraternity of the outstanding ones, One of the Banu Sasan… And the sweetest way of life we have experienced is one spent in sexual indulgence and wine drinking. For we are the lads, the only lads who really matter, on land and sea.
In this sense, of course, the Banu Sasan were merely the Middle Eastern equivalents of rogues who have always existed in every culture and under the banner of every religion; Christian Europe had equivalents enough, as Chaucer’s Pardoner can testify. Yet the criminals produced by medieval Islam seem to have been especially resourceful and ingenious. Ismail El Outmani suggests that this was because the Banu Sasan were a product of an urbanization that was all but unknown west of Constantinople at this time. The Abbasid caliphate’s capital, Baghdad, had a population that peaked at perhaps half a million in the days of Haroun al-Rashid (c.763-809), the caliph depicted in the Thousand and One Nights—large and wealthy enough to offer crooks the sort of wide variety of opportunities that encouraged specialization. But membership of the fraternity was defined by custom as much as it was by criminal inclination; poets, El Outmani reminds us, literally and legally became rogues whenever a patron dispensed with their services. While most members of the Banu Sasan appear to have lived and worked in cities, they also cropped up in more rural areas, and even in the scarcely populated deserts of the region. The so-called prince of camel thieves, for instance—one Shaiban bin Shihab—developed the novel technique of releasing a container filled with voracious camel ticks on the edges of an encampment. When the panicked beasts of burden scattered, he would seize his chance and steal as many as he could. To immobilize any watchdogs in the area, other members of the Banu Sasan would “feed them a sticky mixture of oil-dregs and hair clippings”—the contemporary writer Damiri notes—“which clogs their teeth and jams up their jaws.” The best-known of the writers who describe the Banu Sasan is Al-Jahiz, a noted scholar and prose stylist who may have been of Ethiopian extraction, but who lived and wrote in the heartland of the Abbasid caliphate in the first half of the ninth century. Less well known, but of still greater importance, is the Kashf al-asrar, an obscure work by the Syrian writer Jaubari that dates to around 1235. This short book—the title can be translated as Unveiling of Secrets—is in effect a guide to the methods of the Banu Sasan, written expressly to put its readers on guard against tricksters and swindlers. It is a mine of information concerning the methods of the Islamic underworld, and is plainly the result of considerable research; at one point Jaubari tells us that he studied several hundred works in order to produce his own; at another, he notes that he has uncovered 600 stratagems and tricks used by housebreakers alone. In all, Jaubari sets out 30 chapters’ worth of information on the methods of everyone from crooked jewelers—whom he says had 47 different ways of manufacturing false diamonds and emeralds—to alchemists with their “300 ways of dakk” (falsification). He details the way in which money-changers wore magnetized rings to deflect the indicator on their scales, or used rigged balances filled with mercury, which artificially inflated the weight of the gold that was placed on them. Our sources are united in suggesting that a large proportion of the Banu Sasan were Kurds, a people seen by other Middle Eastern peoples as brigands and predators. They also show that the criminal slang they employed drew on a wide variety of languages.
Much of it has its origins in what Johann Fück has termed “Middle Arabic,” but the remainder seems to be derived from everything from Byzantine Greek to Persian, Hebrew and Syriac. This is a useful reminder not only of what a cosmopolitan place western Asia was during the years of the early Islamic ascendancy, but also that much criminal slang has its origins in the requirement to be obscure—most obviously because there is often an urgent need to hide what was being discussed from listeners who might report the speakers to the police. Ultimately, however, what strikes one most about the Banu Sasan is their remarkable inclusiveness. At one extreme lie the men of violence; another of Bosworth’s sources, ar-Raghib al-Isfahani, lists five separate categories of thug, from the housebreaker to out-and-out killers such as the sahib ba’j, the “disemboweler and ripper-open of bellies,” and the sahib radkh, the “crusher and pounder” who accompanies lone travelers on their journeys and then, when his victim has prostrated himself in prayer, “creeps up and hits him simultaneously over the head with two smooth stones.” At the other lie the poets, among them the mysterious Al-Ukbari—of whom we are told little more than that he was “the poet of rogues, their elegant exponent and the wittiest of them all.” In his writings, Al-Ukbari frankly admitted that he could not “earn any sort of living through philosophy or poetry, but only through trickery.” And among the meager haul of 34 surviving stanzas of his verse can be found this defiant statement: Nevertheless I am, God be praised, A member of a noble house, Through my brethren the Banu Sasan, The influential and bold ones… When the roads become difficult for both The night travelers and the soldiery, on the alert against their enemies, The Bedouins and the Kurds, We sail forward along that way, without The need of sword or even of scabbard, And the person who fears his foes seeks Refuge by means of us, in his terror. Sources Amira Bennison. The Great Caliphs: the Golden Age of the ‘Abbasid Empire. London: IB Tauris, 2009; Clifford Bosworth. The Medieval Islamic Underworld: The Banu Sasan in Arabic Society and Literature. Leiden: E.J. Brill, 2 vols., 1976; Richard Bulliet. What Life Was Like in the Lands of the Prophet: Islamic World, AD570-1405. New York: Time-Life, 1999; Ismail El Outmani. “Introduction to Arabic ‘carnivalised’ literature.” In Concepción Vázquez de Benito & Miguel Ángel Manzano Rodríguez (eds). Actas XVI Congreso UEAI. Salamanca: Gráficas Varona, nd (c.1995); Li Guo. The Performing Arts in Medieval Islam: Shadow Play and Popular Poetry in Ibn Daniyal’s Mamluk Cairo. Leiden: Brill, 2012; Ahmad Ghabin. Hisba, Arts & Crafts in Islam. Wiesbaden: Otto Harrassowitz, 2009; Robert Irwin. The Penguin Anthology of Classical Arabic Literature. London: Penguin, 1999; Adam Sabra. Poverty and Charity in Medieval Islam: Mamluk Egypt, 1250-1517. Cambridge: Cambridge University Press, 2000. Mike Dash is a contributing writer in history for Smithsonian.com. Before Smithsonian.com, Dash authored the award-winning blog A Blast From the Past.
5c6a182c07a10fdfa611ca9c3513ffd8
https://www.smithsonianmag.com/history/its-a-wurlitzer-61398212/
It’s a Wurlitzer
It’s a Wurlitzer Of all the musical instruments in the Smithsonian Institution’s collection of 5,200 violins, pianos, banjos and others, the largest—it fills three rooms—represents a unique period of nearly forgotten American history. It’s a Wurlitzer theater organ. In the early 20th century, thousands of these gigantic pipe organs were installed in movie theaters throughout the United States, Canada, England and Australia to accompany silent movies. This one worked its wonders in the Fox Theatre in Appleton, Wisconsin. The Smithsonian’s instrument is a rare, completely original Wurlitzer donated by the estate of Lowell Ayars, a New Jersey music teacher, in 1993. Ayars kept it in museum-quality condition during the 30-some years it was played in his home. When Ayars died in 1992, he willed it to his friend Brantley Duddy, and Duddy contacted the Smithsonian, which gratefully accepted it for the musical instrument collection of the National Museum of American History. For now, it sits in storage, its burnished white-and-gold console protected by a sheet of plastic. But there are plans to restore it to glory. The Ayars organ, a Model 190 (serial number 2070), was built by the Rudolph Wurlitzer Company of North Tonawanda, New York, in 1929 for the Fox Theatre. After the theater became a department store in 1959, the organ briefly went into storage until Ayars bought it and installed it in his New Jersey home. As theater organs go, this one is modest in size, its pipes fitting into a space about 15 feet wide and 13 feet deep. It sports two keyboards (called manuals), 584 individual pipes organized into eight ranks, and four tuned percussion instruments as well as special effects. The largest original Wurlitzer still in operation—with more than 4,000 pipes in 58 ranks, ranging from 32 feet in length to the size of a pencil—is also the most famous: the Radio City Music Hall Wurlitzer in New York City, which was installed in 1932. Between 1911 and 1943, the Rudolph Wurlitzer Company built more than 2,000 theater organs, most of them about the size of the Ayars, for smaller, neighborhood theaters. The first silent films had been accompanied by a pit orchestra or, for the more frugally minded impresario, a lone piano. When the theater organ came along, with its ability to imitate an orchestra and create special sound effects, every movie house owner had to have one. At its peak in 1926, the company was shipping a Wurlitzer a day, mass-producing one of the most technologically advanced machines of its time. The theater organ is related to the classic church pipe organ, whose basic design has been around for more than 2,000 years. Air blown through pipes, each tuned to create a different musical tone, creates the sound. Blowers located under the ranks, or sets of pipes, force air into them when valves are opened as the organist plays the keys and stops (tabs the organist flips up or down to activate different ranks of pipes). In a church organ, this rather simple mechanism can produce only a certain number of sounds. To the dismay of lovers of the traditional organ, British inventor and telephone engineer Robert Hope-Jones electrified it and created a switching system to allow any combination of pipes and effects to be played at once. His instruments could produce numerous inventive sound effects, including train and boat whistles, car horns and bird whistles, and some could even simulate pistol shots, ringing phones, the sound of surf, horses’ hooves, smashing pottery, thunder and rain. 
The new organs either incorporated or at least imitated other musical instruments—from piano and violin to trumpet, drums, cymbals, even bells and chimes. Hope-Jones dubbed it the Unit Orchestra: with it an organist could imitate an entire dance band or orchestra. In 1910, after his company foundered, Hope-Jones was bought out by the Wurlitzer Company, which, with elegant-looking products and aggressive advertising, dominated the theater organ market. Even today, many people remember the slogan: "Gee Dad, it’s a Wurlitzer." Wurlitzer’s time in the limelight was brief. The sound of Al Jolson’s voice in The Jazz Singer of 1927 spelled doom for the theater organ. Soon Hollywood was putting sound in every movie it produced. By the mid-1930s, most theater owners had replaced their organs with speaker systems. Of the more than 5,000 organs manufactured in the early 1900s, only a few hundred remain in public venues; a few others, like the Ayars organ, were rescued by private collectors. Only a handful are in their original theater installations. Richmond, Virginia, has three theaters with original organs, the Chicago Theatre still has its Wurlitzer, and some of the truly grand movie palaces have original organ installations, including the Fox Theatres in Atlanta, St. Louis and Detroit and the Orpheum in Los Angeles. Forty years ago, Carsten Henningson, owner of Ye Olde Pizza Joynt in Hayward, California, and a devoted organ enthusiast, decided a Wurlitzer might help boost business. It did just that, and the phenomenon spread throughout the state and beyond as dozens of moribund theater organs found new lives in restaurants. At one such venue—the Bella Roma Pizza restaurant in Martinez, California—on a recent Sunday night, organist Kevin King put a Wurlitzer through its paces, bouncing in his seat as his hands played different keyboards, occasionally pausing to flip stops, while his feet plied the pedals. "You’re playing all the orchestra sounds plus some real instruments," he says. Musical historians and theater organ buffs would like to see the Smithsonian’s Wurlitzer played publicly once again. Exhibits specialist and theater organist Brian Jensen helped bring the organ to the Institution. "Ours does not have all the bells and whistles of the larger organs found in big cities," says Jensen, "but it represents what was in 90 percent of the theaters across the country, in neighborhoods and smaller towns. Like the Star-Spangled Banner, it’s a recognized symbol of American culture."
340fafa40ca99970299e18afbadcada0
https://www.smithsonianmag.com/history/its-time-to-cut-barbie-a-little-slack-4110448/
It’s Time to Cut Barbie a Little Slack
It’s Time to Cut Barbie a Little Slack She’s wearing entirely too much eyeliner. When the Mattel company introduced Barbie to the world, in 1959, she wore a black-and-white striped one-piece bathing suit, black heels, white sunglasses and...entirely too much eyeliner. The makeup was no doubt applied because Barbie was meant to be older than traditional dolls marketed to preteens. Here, at long last, was a modern gal who could hold down a job, date and drive. Of course, despite these life skills, Barbie’s most consistent feat turned out to be stirring up controversy. One Barbie doll is sold every three seconds somewhere in the world. No one that popular is universally adored. Barbie has long drawn criticism for her unrealistic—nay, fatal, if applied to any human counterpart—proportions as well as her role as Forewoman of the Gender Stereotype Factory. In addition to plastic combs and hand mirrors, she comes with a litany of feminist faux pas. As recently as 1991, Barbie pronounced, via a small speaker embedded in her abdomen, that “math class is tough” and “party dresses are fun.” The following year brought the best-selling Barbie doll of all time, Totally Hair Barbie. Hair she tied back when she appeared in the exercise tape “Dance! Workout With Barbie!” This, even though Barbie, being a doll, has little need for cardio, and impressionable preteen girls already dying to look like the models they see in magazines have even less for it. But 20 years later, is Barbie really such a menace to society? Or is she an institution of plastic Americana, a blank slate on which we’ve superimposed half the population’s challenges? As an American woman (a child of the mid-’80s, I was weaned on Barbie and the Rockers), I have officially decided to cut Barbie a little slack. Terrible makeup and all. We live in a world where Barbie is no longer forced to shoulder the burden of American female self-esteem by herself, just as G.I. Joe can no longer be faulted for promoting youth violence when there’s a computer and an Xbox at hand. Good old Barbie seems relatively harmless compared with, say, the entire catalog of reality television. There is something incredibly appealing about Barbie’s size (not her proportions, mind you) to the animated hand of a little girl during playtime. You can get a real grip on Barbie, safe in the knowledge that she won’t slump over as you bob her back and forth in conversation. And those conversations, especially concerning Ken, can get heated. I remember getting so irate with a friend’s Barbie during a play date that I had my Barbie march off, jump into her Barbie Corvette, put her webbed foot on the gas pedal and drive straight to the living room. Try doing that with an American Girl or a Polly Pocket. For all of Barbie’s girly reputation, she’s built for real emotions, for backyard adventures and roughhousing. Also to her credit? Math wouldn’t always be tough. The woman has managed to hold down over 130 careers. Besides the aerobics instructing and lifeguarding, she has also been an astronaut, a presidential candidate, an architect, an engineer, a doctor and a paleontologist. Sure, her longest-standing career has been that of fashion model, but you try having the same job for five decades and see if you don’t start dabbling in firefighting and dentistry. 
Ultimately, half the fun of Barbie is imagining her as a single woman with all these careers, a filter for the changing desires of girls, even if Barbie’s progressive accomplishments are just as unrealistic as her antiquated ones. No woman has those hips and that rib cage, and no woman has designed an airplane and piloted it while simultaneously serving drinks and snacks in the main cabin. Though I suppose if anyone could do it, Barbie could. The author of two best-selling collections of essays, I Was Told There’d Be Cake and How Did You Get This Number, Sloane Crosley fondly remembers playing with her own Barbie doll. “By the time I was playing with Barbie—in the late ’80s and early ’90s—she was really a canvas for her owner’s personality,” she says. “So one of my Barbies sped around the house in a Corvette, managed a clothing store and dressed up like an Eskimo before spending the night in the refrigerator.” Crosley’s first novel, The Clasp, will be published in 2015.
842004c0757e10b98b7d50c726e885c4
https://www.smithsonianmag.com/history/jacques-louis-david-60191520/
Jacques-Louis David
Jacques-Louis David The savagery of the French Revolution, which declared the Rights of Man but turned to bloody-handed tyranny and repressive terrorism, has long puzzled historians. Among the causes, and one rarely remembered, Elizabeth Wilson writes, was the painter Jacques-Louis David. Today, he is best known as one of the great masters of French painting — a defining exponent of an austere neoclassical style that dominated European art for almost a half-century — and one of the precursors of modern painting. But for a few terrifying years David was also "the propaganda minister of the French Revolution — a man who could turn an unruly mob, ready to kill for a loaf of bread, into tearful patriots willing to die for the cause." Wilson's story traces David's life and work, his great ambition and success. That success was mostly nonpolitical until 1785, when one of his monumental and posterish neoclassical paintings, The Oath of the Horatii, in which three brothers swear to fight to the death for their homeland, became linked to patriotic fervor as the Revolution was about to get under way. David went on not only to document the Tennis Court Oath, when the Revolution more or less officially began, but to produce on demand "state funerals and martyr portraits, multimedia pageants with a cast of thousands — all designed to keep the revolutionary faith alive, even when bodies were piling up ten deep beside la guillotine." His most startling picture, and one that links him most clearly to modern painting, is the martyr portrait of Jacobin leader Jean-Paul Marat, dead in his bath after being stabbed by Charlotte Corday. The guillotine devoured many revolutionary leaders, and, indeed, David had declared he wanted to die with Robespierre, the principal architect of the Terror. But he survived instead, and soon began fawning upon the young Napoleon. David was a turncoat and a sycophant, but a great painter. "He was born into a world in which painting was for the privileged few," Wilson writes. "His images showed the power of art to electrify even the commonest citizen."
8beac626e609b7bb72582a9df3005c81
https://www.smithsonianmag.com/history/john-adams-out-thomas-jefferson-sally-hemings-180960789/
Did John Adams Out Thomas Jefferson and Sally Hemings?
Did John Adams Out Thomas Jefferson and Sally Hemings? The first eight months of 1802 were mercifully dull for President Jefferson. France and England signed a peace treaty, reopening European and Caribbean ports to American commerce. The Navy was making headway against Barbary pirates in the Mediterranean. West Point was established. A prime concern was paying off the national debt. The bitter election of 1800 was fading from memory. Then, in the September 1 issue of the Richmond Recorder, James Callender, a notorious journalist, reported that the president of the United States had a black slave mistress who had borne him a number of children. “IT is well known that the man, whom it delighteth the people to honor, keeps, and for many years past has kept, as his concubine, one of his own slaves,” the story began. “Her name is SALLY.” Federalist newspapers from Maine to Georgia reprinted the story. Racist poems were published about the president and “Dusky Sally.” Jefferson’s defenders were more muted, waiting in vain for the denial that never came from the Executive Mansion. The scandal rocked the fledgling nation. How “well known” was the relationship between Jefferson and Hemings? Callender wrote that it had “once or twice been hinted at” in newspapers, as indeed it was in 1800 and 1801. And in reaction to his muckraking, the Gazette of the United States said it had “heard the same subject freely spoken of in Virginia, and by Virginia Gentlemen.” But while scholars have combed the sources, they have identified no specific written reference to the Jefferson-Hemings liaison prior to the appearance of Callender’s scandalous report. I believe I have found two such references. They precede the exposé by more than eight years, and they come from the pen of none other than Jefferson’s old friend and political rival John Adams. In letters to his sons Charles and John Quincy in January of 1794, Adams points to the relationship between the sage of Monticello and the beautiful young woman known around the plantation as “Dashing Sally.” The references have escaped notice until now because Adams used a classical allusion whose significance historians and biographers have failed to appreciate. Adams’ letters offer tangible evidence that at least one of the country’s leading political families was aware of the Jefferson-Hemings relationship long before the scandal broke. The documents cast new light on the question of elite awareness of the relationship, on the nature of the press in the early republic, and on Adams himself. ********** Jefferson resigned as George Washington’s secretary of state on the last day of 1793. It had not been a good year. His efforts to force his hated rival Alexander Hamilton out of the cabinet for financial misconduct failed miserably. Continuing to support the French Revolution despite the guillotining of the king and queen and the blossoming of the Terror, he alienated Adams and was disappointed by Washington’s proclamation of American neutrality in France’s latest war with England. At 50 years old, he was eager to return to his beloved Virginia estate to live as a gentleman farmer and philosopher. Adams, the vice president, refused to believe that his estranged friend was really done with public life. In letters to his two eldest sons, he sourly assessed the man he was convinced would challenge him to succeed Washington as president. 
On January 2 he wrote to Charles: Mr Jefferson is going to Montecello to Spend his Days in Retirement, in Rural Amusements and Philosophical Meditations—Untill the President dies or resigns, when I suppose he is to be invited from his Conversations with Egeria in the Groves, to take the Reins of the State, and conduct it forty Years in Piety and Peace. On January 3 he wrote to John Quincy at greater length, enumerating seven possible motives for Jefferson’s resignation. 5. Ambition is the Subtlest Beast of the Intellectual and Moral Field. It is wonderfully adroit in concealing itself from its owner, I had almost said from itself. Jefferson thinks he shall by this step get a Reputation of an humble, modest, meek Man, wholly without ambition or Vanity. He may even have deceived himself into this Belief. But if a Prospect opens, The World will see and he will feel, that he is as ambitious as Oliver Cromwell though no soldier. 6. At other Moments he may meditate the gratification of his Ambition; Numa was called from the Forrests to be King of Rome. And if Jefferson, after the Death or Resignation of the President should be summoned from the familiar Society of Egeria, to govern the Country forty Years in Peace and Piety, So be it. In the vernacular of the time, “conversation” was a synonym for sexual intercourse and “familiar” was a synonym for “intimate.” The obvious candidate for the person whose conversation and familiar society Jefferson would supposedly be enjoying at his bucolic home is Sally Hemings. But who was Egeria, and how confident can we be that Adams intended Hemings when he invoked her name? Egeria is a figure of some importance in the mythical early history of ancient Rome. According to Livy and Plutarch, after the death of the warlike Romulus, the senators invited a pious and intellectual Sabine named Numa Pompilius to become their king. Accepting the job with some reluctance, Numa set about establishing laws and a state religion. To persuade his unruly subjects that he had supernatural warrant for his innovations, Numa claimed that he was under the tutelage of Egeria, a divine nymph or goddess whom he would meet in a sacred grove. The stories say she was not just his instructor but also his spouse, his Sabine wife having died some years before. “Egeria is believed to have slept with Numa the just,” Ovid wrote in his Amores. Age 40 when he became king, Numa reigned for 43 years—a golden age of peace for Rome during which, in Livy’s words, “the neighboring peoples also, who had hitherto considered that it was no city but a bivouac that had been set up in their midst, as a menace to the general peace, came to feel such reverence for them, that they thought it sacrilege to injure a nation so wholly bent upon the worship of the gods.” Adams, who was well versed in Latin and Greek literature, had every reason to feel pleased with his comparison. Like Rome at the end of Romulus’ reign, the United States was a new nation getting ready for its second leader. Jefferson would be the American Numa, a philosophical successor to the military man who had won his country’s independence. Like Numa, Jefferson was a widower (his wife, Martha, died in 1782) who would prepare himself for the job by consorting with a nymph, his second wife, in a grove that was sacred to him. I asked Annette Gordon-Reed, the Harvard scholar and author of Thomas Jefferson and Sally Hemings: An American Controversy, what she made of the Adams references. 
“While the two letters to his sons do not definitively prove that Adams knew about the Jefferson-Hemings liaison in early 1794,” Gordon-Reed said in an email, “this elucidation of the allusion to Egeria makes that an intriguing possibility.” One didn’t require a classical education to grasp the Egeria allusion in the early 1790s. In 1786, the French writer Jean-Pierre Claris de Florian had published Numa Pompilius, Second Roi de Rome, a romantic novel dedicated to Marie Antoinette—she liked it—and intended as a guide for an enlightened monarchy in France. (“People will believe I’ve written the story / Of you, of Louis, and of the French,” Florian’s dedicatory poem declares.) Soon translated into English, Spanish and German, the novel became a runaway best seller in the North Atlantic world. It was while researching a novel of my own about the life and afterlife of Numa and Egeria that I happened upon the allusions in the two Adams letters. As a student of religion in public life, I have long been interested in Numa as an exemplary figure in the history of Western political thought from Cicero and St. Augustine to Machiavelli and Rousseau. In fact, John Adams had made a point of invoking Numa and his divine consort in the three-volume Defence of the Constitutions of Government of the United States of America, which he published while serving as minister to Eng­land in 1787. “It was the general opinion of ancient nations, that the divinity alone was adequate to the important office of giving laws to men,” he writes in the preface. “Among the Romans, Numa was indebted for those laws which procured the prosperity of his country to his conversations with Egeria.” Later in the work he explains, “Numa was chosen, a man of peace, piety, and humanity, who had address enough to make the nobles and people believe that he was married to the goddess Egeria, and received from his celestial consort all his laws and measures.” In the Defence, Adams was at pains to inform the world that, unlike other nations past and present, the recently united American states “have exhibited, perhaps, the first example of governments erected on the simple principles of nature.” In other words, no Egerias need apply: “It will never be pretended that any persons employed in that service had any interviews with the gods, or were in any degree under the inspiration of heaven, any more than those at work upon ships or houses, or labouring in merchandize or agriculture: it will for ever be acknowledged that these governments were contrived merely by the use of reason and the senses.” ********** Jefferson was the American avatar of Enlightenment rationality, a staunch opponent of the government establishment of religion, and the Washington administration’s foremost advocate of war with the Barbary pirates. Adams’ portrayal of him consulting with a goddess in order to govern “in Piety and Peace” was sharply pointed on all counts. But did he intend the goddess in question to refer to Sally Hemings? There’s good reason to think so. Seven years earlier, Jefferson had arranged for his 8-year-old daughter, Mary, to join him and his elder daughter, Martha, in Paris. Hemings, a slave who was also a half-sister of Jefferson’s late wife, accompanied Mary on the trans-Atlantic passage to England; upon their arrival, the two girls went to stay with the Adamses in London. Hemings was then 14 years old but, tellingly, Abigail Adams thought she was 15 or 16. 
Writing Jefferson that the two had arrived, Abigail Adams took them under her wing until an emissary showed up two weeks later to convey them to Paris, where Jefferson almost certainly began having sex with Hemings. So in 1787 John Adams had seen for himself that Jefferson had a nubile beauty in his possession. By the end of 1793, John Quincy and Charles presumably would have been aware of it, too. Otherwise, the sexual allusion to Egeria would have been lost on them. Significantly, John Adams did not allude to the matter when he wrote to Abigail at around the same time. She and Jefferson had something of a mutual admiration society, after all. “My Love to Thomas,” she wrote her husband on the very day that Jefferson resigned as secretary of state (though she wasn’t yet aware of that). Despite the two men’s political rivalry, she maintained a high regard for Jefferson through the 1790s, describing him as a man of “probity” in a letter to her sister. So while John Adams, in Philadelphia, did not refrain from criticizing Jefferson in his January 6, 1794, letter to Abigail, in Massachusetts, he did so with care. Jefferson went off Yesterday, and a good riddance of bad ware. I hope his Temper will be more cool and his Principles more reasonable in Retirement than they have been in office. I am almost tempted to wish he may be chosen Vice President at the next Election for there if he could do no good, he could do no harm. He has Talents I know, and Integrity I believe: but his mind is now poisoned with Passion Prejudice and Faction. There was no mention of Numa and Egeria. As I see it, John knew that his wife would not be amused by the insinuation that Jefferson was retiring to an intimate relationship with the maidservant she had cared for in London seven years earlier. That joke was reserved for the boys. A political eon passed between the vice president’s private joke and the presidential scandal. In 1796, Jefferson was narrowly defeated for the presidency by Adams and, under Article II of the Constitution (changed in 1804), indeed became vice president, having received the second-largest number of electoral votes. Four years later, he returned the favor, besting Adams in perhaps the ugliest presidential election in American history. By then, Callender had won his muckraking spurs by publishing the story of Alexander Hamilton’s affair with a married woman and subsequent illicit financial arrangement with the woman’s husband. Jefferson was sufficiently impressed to provide the journalist with financial support to keep up his anti-Federalist work. But in May of 1800, Callender was convicted and sentenced to nine months in prison under the Sedition Act for “The Prospect Before Us,” a tract alleging pervasive corruption in the Adams administration. After his release, he approached Jefferson and asked to be appointed postmaster of Richmond. Jefferson refused. Callender traveled to Charlottesville and ferreted out the Hemings story, published under the headline “The President, Again.” One of the more scurrilous commentaries on the story came from John Quincy Adams. 
On October 5, he sent his youngest brother, Thomas Boylston, a letter with an imitation of Horace’s famous ode to a friend who had fallen in love with his servant girl that begins: “Dear Thomas, deem it no disgrace / With slaves to mend thy breed / Nor let the wench’s smutty face / Deter thee from the deed.” In his letter John Quincy writes that he had been going through books of Horace to track down the context of a quotation when what should drop out but this poem by, of all people, Jefferson’s ideological comrade in arms Tom Paine, then living in France. John Quincy professed bafflement that “the tender tale of Sally” could have traveled across the Atlantic, and the poem back again, within just a few weeks. “But indeed,” he wrote, “Pain being so much in the philosopher’s confidence may have been acquainted with the facts earlier than the American public in general.” Historians have assumed that John Quincy, an amateur poet, composed the imitation ode in the weeks after Callender’s revelation hit the press. But in light of his father’s letters, it is not impossible that he had written it before, as his arch little story of its discovery implied. Thomas Boylston arranged to have his brother’s poem published in the prominent Federalist magazine The Port-Folio, where it did in fact appear under Paine’s name. The Adamses never dismissed Callender’s story as untrue. No direct comment from Abigail Adams has come to light, but Gordon-Reed argues in The Hemingses of Monticello that the scandal deepened her estrangement from Jefferson after the bitter 1800 election. When Mary Jefferson died in 1804, Abigail wrote Thomas a chilly condolence letter in which she described herself as one “who once took pleasure in subscribing herself your friend.” John Adams, in an 1810 letter to Joseph Ward, refers to James Callender in such a way as to imply that he did not consider the Hemings story credible. “Mr Jeffersons ‘Charities’ as he calls them to Callender, are a blot in his Escutchion,” he writes. “But I believe nothing that Callender Said, any more than if it had been Said by an infernal Spirit.” In the next paragraph, however, he appears more than prepared to suspend any such disbelief. Callender and Sally will be remembered as long as Jefferson as Blotts in his Character. The story of the latter, is a natural and almost unavoidable Consequence of that foul contagion (pox) in the human Character Negro Slavery. In the West Indies and the Southern States it has the Same Effect. A great Lady has Said She did not believe there was a Planter in Virginia who could not reckon among his Slaves a Number of his Children. But is it Sound Policy will it promote Morality, to keep up the Cry of such disgracefull Stories, now the Man is voluntarily retired from the World. The more the Subject is canvassed will not the horror of the Infamy be diminished? and this black Licentiousness be encouraged? Adams goes on to ask whether it will serve the public good to bring up the old story of Jefferson’s attempted seduction of a friend’s wife at the age of 25, “which is acknowledged to have happened.” His concern is not with the truth of such stories but with the desirability of continuing to harp on them (now that there is no political utility in doing so). He does not reject the idea that Jefferson behaved like other Virginia planters. ********** Adams’ sly joke in his 1794 letters shows him as less of a prude than is often thought. 
It also supports Callender’s assertion that the Jefferson-Hemings relationship was “well known,” but kept under wraps. It may be time to moderate the received view that journalism in the early republic was no-holds-barred. In reality, reporters did not rush into print with scandalous accusations of sexual misconduct by public figures. Compared with today’s partisan websites and social media, they were restrained. It took a James Callender to get the ball rolling. John Adams’ reference to Jefferson’s Egeria put him on the cusp of recognizing a new role for women in Western society. Thanks largely to Florian’s 1786 best seller, the female mentor of a politician, writer or artist came to be called his Egeria. That was the case with Napoleon, Beethoven, Mark Twain, Andrew Johnson and William Butler Yeats, to name a few. In Abigail, Adams had his own—though so far as I know she was never referred to as such. It was a halfway house on the road to women’s equality, an authoritative position for those whose social status was still subordinate. Gordon-Reed has criticized biographers who insist that it is “ridiculous even to consider the notion that Thomas Jefferson could ever have been under the positive influence of an insignificant black slave woman.” Ironically, Adams’ sarcastic allusion conjures up the possibility. Did Sally Hemings, Jefferson’s French-speaking bedmate and well-organized keeper of his private chambers, also serve as his guide and counselor—his Egeria? The question is, from the evidence we have, unanswerable. In the last book of his Metamorphoses, Ovid portrays Egeria as so inconsolable after the death of Numa that the goddess Diana turns her into a spring of running water. When Jefferson died in 1826, he and Hemings, like Numa and Egeria, had to all intents and purposes been married for four decades. Not long afterward, his daughter Martha freed Hemings from slavery, as her children had been freed before her. We do not know if, as she celebrated her liberation, she also mourned her loss. But we can be confident that her name, like Egeria’s, will forever be linked with her eminent spouse, as John Adams predicted. Mark Silk is a professor and the director of the Leonard E. Greenberg Center for the Study of Religion in Public Life at Trinity College. A former reporter and editorial writer at the Atlanta Journal-Constitution, he is the author of several books on religion in contemporary America and is a senior columnist for the Religion News Service.
7487bb7ee0f12406e918d4f726e6c8dc
https://www.smithsonianmag.com/history/john-browns-day-of-reckoning-139165084/
John Brown’s Day of Reckoning
John Brown's Day of Reckoning Harpers Ferry, Virginia, lay sleeping on the night of October 16, 1859, as 19 heavily armed men stole down mist-shrouded bluffs along the Potomac River where it joins the Shenandoah. Their leader was a rail-thin 59-year-old man with a shock of graying hair and penetrating steel-gray eyes. His name was John Brown. Some of those who strode across a covered railway bridge from Maryland into Virginia were callow farm boys; others were seasoned veterans of the guerrilla war in disputed Kansas. Among them were Brown's youngest sons, Watson and Oliver; a fugitive slave from Charleston, South Carolina; an African-American student at Oberlin College; a pair of Quaker brothers from Iowa who had abandoned their pacifist beliefs to follow Brown; a former slave from Virginia; and men from Connecticut, New York, Pennsylvania and Indiana. They had come to Harpers Ferry to make war on slavery. The raid that Sunday night would be the most daring instance on record of white men entering a Southern state to incite a slave rebellion. In military terms, it was barely a skirmish, but the incident electrified the nation. It also created, in John Brown, a figure who after a century and a half remains one of the most emotive touchstones of our racial history, lionized by some Americans and loathed by others: few are indifferent. Brown's mantle has been claimed by figures as diverse as Malcolm X, Timothy McVeigh, Socialist leader Eugene Debs and abortion protesters espousing violence. "Americans do not deliberate about John Brown—they feel him," says Dennis Frye, the National Park Service's chief historian at Harpers Ferry. "He is still alive today in the American soul. He represents something for each of us, but none of us is in agreement about what he means." "The impact of Harpers Ferry quite literally transformed the nation," says Harvard historian John Stauffer, author of The Black Hearts of Men: Radical Abolitionists and the Transformation of Race. The tide of anger that flowed from Harpers Ferry traumatized Americans of all persuasions, terrorizing Southerners with the fear of massive slave rebellions, and radicalizing countless Northerners, who had hoped that violent confrontation over slavery could be indefinitely postponed. Before Harpers Ferry, leading politicians believed that the widening division between North and South would eventually yield to compromise. After it, the chasm appeared unbridgeable. Harpers Ferry splintered the Democratic Party, scrambled the leadership of the Republicans and produced the conditions that enabled Republican Abraham Lincoln to defeat two Democrats and a third-party candidate in the presidential election of 1860. "Had John Brown's raid not occurred, it is very possible that the 1860 election would have been a regular two-party contest between antislavery Republicans and pro-slavery Democrats," says City University of New York historian David Reynolds, author of John Brown: Abolitionist. "The Democrats would probably have won, since Lincoln received just 40 percent of the popular vote, around one million votes less than his three opponents." While the Democrats split over slavery, Republican candidates such as William Seward were tarnished by their association with abolitionists; Lincoln, at the time, was regarded as one of his party's more conservative options. "John Brown was, in effect, a hammer that shattered Lincoln's opponents into fragments," says Reynolds. 
"Because Brown helped to disrupt the party system, Lincoln was carried to victory, which in turn led 11 states to secede from the Union. This in turn led to the Civil War." Well into the 20th century, it was common to dismiss Brown as an irrational fanatic, or worse. In the rousing pro-Southern 1940 classic film Santa Fe Trail, actor Raymond Massey portrayed him as a wild-eyed madman. But the civil rights movement and a more thoughtful acknowledgment of the nation's racial problems have occasioned a more nuanced view. "Brown was thought mad because he crossed the line of permissible dissent," Stauffer says. "He was willing to sacrifice his life for the cause of blacks, and for this, in a culture that was simply marinated in racism, he was called mad." Brown was a hard man, to be sure, "built for times of trouble and fitted to grapple with the flintiest hardships," in the words of his close friend, the African-American orator Frederick Douglass. Brown felt a profound and lifelong empathy with the plight of slaves. "He stood apart from every other white in the historical record in his ability to burst free from the power of racism," says Stauffer. "Blacks were among his closest friends, and in some respects he felt more comfortable around blacks than he did around whites." Brown was born with the century, in 1800, in Connecticut, and raised by loving if strict parents who believed (as did many, if not most, in that era) that righteous punishment was an instrument of the divine. When he was a small boy, the Browns moved west in an ox-drawn wagon to the raw wilderness of frontier Ohio, settling in the town of Hudson, where they became known as friends to the rapidly diminishing population of Native Americans, and as abolitionists who were always ready to help fugitive slaves. Like many restless 19th-century Americans, Brown tried many professions, failing at some and succeeding modestly at others: farmer, tanner, surveyor, wool merchant. He married twice—his first wife died from illness—and, in all, fathered 20 children, almost half of whom died in infancy; 3 more would die in the war against slavery. Brown, whose beliefs were rooted in strict Calvinism, was convinced that he had been predestined to bring an end to slavery, which he believed with burning certitude was a sin against God. In his youth, both he and his father, Owen Brown, had served as "conductors" on the Underground Railroad. He had denounced racism within his own church, where African-Americans were required to sit in the back, and shocked neighbors by dining with blacks and addressing them as "Mr." and "Mrs." Douglass once described Brown as a man who "though a white gentleman, is in sympathy, a black man, and as deeply interested in our cause, as though his own soul had been pierced with the iron of slavery." In 1848, the wealthy abolitionist Gerrit Smith encouraged Brown and his family to live on land Smith had bestowed on black settlers in northern New York. Tucked away in the Adirondack Mountains, Brown concocted a plan to liberate slaves in numbers never before attempted: A "Subterranean Pass-Way"—the Underground Railroad writ large—would stretch south through the Allegheny and Appalachian mountains, linked by a chain of forts manned by armed abolitionists and free blacks. "These warriors would raid plantations and run fugitives north to Canada," says Stauffer. "The goal was to destroy the value of slave property." 
This scheme would form the template for the Harpers Ferry raid and, says Frye, under different circumstances "could have succeeded. [Brown] knew that he couldn't free four million people. But he understood economics and how much money was invested in slaves. There would be a panic—property values would dive. The slave economy would collapse." Political events of the 1850s turned Brown from a fierce, if essentially garden-variety, abolitionist into a man willing to take up arms, even die, for his cause. The Fugitive Slave Law of 1850, which imposed draconian penalties on anyone caught helping a runaway and required all citizens to cooperate in the capture of fugitive slaves, enraged Brown and other abolitionists. In 1854, another act of Congress pushed still more Northerners beyond their limits of tolerance. Under pressure from the South and its Democratic allies in the North, Congress opened the territories of Kansas and Nebraska to slavery under a concept called "popular sovereignty." The more northerly Nebraska was in little danger of becoming a slave state. Kansas, however, was up for grabs. Pro-slavery advocates—"the meanest and most desperate of men, armed to the teeth with Revolvers, Bowie Knives, Rifles & Cannon, while they are not only thoroughly organized, but under pay from Slaveholders," John Brown Jr. wrote to his father—poured into Kansas from Missouri. Antislavery settlers begged for guns and reinforcements. Among the thousands of abolitionists who left their farms, workshops or schools to respond to the call were John Brown and five of his sons. Brown himself arrived in Kansas in October 1855, driving a wagon loaded with rifles he had picked up in Ohio and Illinois, determined, he said, "to help defeat Satan and his legions." In May 1856, pro-slavery raiders sacked Lawrence, Kansas, in an orgy of burning and looting. Almost simultaneously, Brown learned that Charles Sumner of Massachusetts, the most outspoken abolitionist in the U.S. Senate, had been beaten senseless on the floor of the chamber by a cane-wielding congressman from South Carolina. Brown raged at the North's apparent helplessness. Advised to act with restraint, he retorted, "Caution, caution, sir. I am eternally tired of hearing the word caution. It is nothing but the word of cowardice." A party of Free-Staters led by Brown dragged five pro-slavery men out of their isolated cabins on eastern Kansas' Pottawatomie Creek and hacked them to death with cutlasses. The horrific nature of the murders disturbed even abolitionists. Brown was unrepentant. "God is my judge," he laconically replied when asked to account for his actions. Though he was a wanted man who hid out for a time, Brown eluded capture in the anarchic conditions that pervaded Kansas. Indeed, almost no one—pro-slavery or antislavery—was ever arraigned in a court for killings that took place during the guerrilla war there. The murders, however, ignited reprisals. Pro-slavery "border ruffians" raided Free- Staters' homesteads. Abolitionists fought back. Hamlets were burned, farms abandoned. Brown's son Frederick, who had participated in the Pottawatomie Creek massacre, was shot dead by a pro-slavery man. Although Brown survived many brushes with opponents, he seemed to sense his own fate. In August 1856 he told his son Jason, "I have only a short time to live—only one death to die, and I will die fighting for this cause." By almost any definition, the Pottawatomie killings were a terrorist act, intended to sow fear in slavery's defenders. 
"Brown viewed slavery as a state of war against blacks—a system of torture, rape, oppression and murder—and saw himself as a soldier in the army of the Lord against slavery," says Reynolds. "Kansas was Brown's trial by fire, his initiation into violence, his preparation for real war," he says. "By 1859, when he raided Harpers Ferry, Brown was ready, in his own words, ‘to take the war into Africa'—that is, into the South." In January 1858, Brown left Kansas to seek support for his planned Southern invasion. In April, he sought out a diminutive former slave, Harriet Tubman, who had made eight secret trips to Maryland's Eastern Shore to lead dozens of slaves north to freedom. Brown was so impressed that he began referring to her as "General Tubman." For her part, she embraced Brown as one of the few whites she had ever met who shared her belief that antislavery work was a life-and-death struggle. "Tubman thought Brown was the greatest white man who ever lived," says Kate Clifford Larson, author of Bound for the Promised Land: Harriet Tubman, Portrait of an American Hero. Having secured financial backing from wealthy abolitionists known as the "Secret Six," Brown returned to Kansas in mid-1858. In December, he led 12 fugitive slaves on an epic journey eastward, dodging pro-slavery guerrillas and marshals' posses and fighting and defeating a force of United States troops. Upon reaching Detroit, they were ferried across the Detroit River to Canada. Brown had covered nearly 1,500 miles in 82 days, proof to doubters, he felt sure, that he was capable of making the Subterranean Pass-Way a reality. With his "Secret Six" war chest, Brown purchased hundreds of Sharps carbines and thousands of pikes, with which he planned to arm the first wave of slaves he expected to flock to his banner once he occupied Harpers Ferry. Many thousands more could then be armed with rifles stored at the federal arsenal there. "When I strike, the bees will swarm," Brown assured Frederick Douglass, whom he urged to sign on as president of a "Provisional Government." Brown also expected Tubman to help him recruit young men for his revolutionary army, and, says Larson, "to help infiltrate the countryside before the raid, encourage local blacks to join Brown and when the time came, to be at his side—like a soldier." Ultimately, neither Tubman nor Douglass participated in the raid. Douglass was sure the venture would fail. He warned Brown that he was "going into a perfect steel trap, and that he would not get out alive." Tubman may have concluded that if Brown's plan failed, the Underground Railroad would be destroyed, its routes, methods and participants exposed. Sixty-one miles northwest of Washington, D.C., at the junction of the Potomac and Shenandoah rivers, Harpers Ferry was the site of a major federal armory, including a musket factory and rifle works, an arsenal, several large mills and an important railroad junction. "It was one of the most heavily industrialized towns south of the Mason-Dixon line," says Frye. "It was also a cosmopolitan town, with a lot of Irish and German immigrants, and even Yankees who worked in the industrial facilities." The town and its environs' population of 3,000 included about 300 African-Americans, evenly divided between slave and free. But more than 18,000 slaves—the "bees" Brown expected to swarm—lived in the surrounding counties. 
As his men stepped off the railway bridge into town that October night in 1859, Brown dispatched contingents to seize the musket factory, rifle works, arsenal and adjacent brick fire-engine house. (Three men remained in Maryland to guard weapons that Brown hoped to distribute to slaves who joined him.) "I want to free all the negroes in this state," he told one of his first hostages, a night watchman. "If the citizens interfere with me, I must only burn the town and have blood." Guards were posted at the bridges. Telegraph lines were cut. The railroad station was seized. It was there that the raid's first casualty occurred, when a porter, a free black man named Hayward Shepherd, challenged Brown's men and was shot dead in the dark. Once key locations had been secured, Brown sent a detachment to seize several prominent local slave owners, including Col. Lewis W. Washington, a great-grandnephew of the first president. Early reports claimed that Harpers Ferry had been taken by 50, then 150, then 200 white "insurrectionists" and "six hundred runaway negroes." Brown expected to have 1,500 men under his command by midday Monday. He later said he believed that he would eventually have armed as many as 5,000 slaves. But the bees did not swarm. (Only a handful of slaves lent Brown assistance.) Instead, as Brown's band watched dawn break over the craggy ridges enclosing Harpers Ferry, local white militias—similar to today's National Guard—were hastening to arms. First to arrive were the Jefferson Guards, from nearby Charles Town. Uniformed in blue, with tall black Mexican War-era shakos on their heads and brandishing .58-caliber rifles, they seized the railway bridge, killing a former slave named Dangerfield Newby and cutting Brown off from his route of escape. Newby had gone north in a failed attempt to earn enough money to buy freedom for his wife and six children. In his pocket was a letter from his wife: "It is said Master is in want of money," she had written. "I know not what time he may sell me, and then all my bright hopes of the future are blasted, for their [sic] has been one bright hope to cheer me in all my troubles, that is to be with you." As the day progressed, armed units poured in from Frederick, Maryland; Martinsburg and Shepherdstown, Virginia; and elsewhere. Brown and his raiders were soon surrounded. He and a dozen of his men held out in the engine house, a small but formidable brick building, with stout oak doors in front. Other small groups remained holed up in the musket factory and rifle works. Acknowledging their increasingly dire predicament, Brown sent out New Yorker William Thompson, bearing a white flag, to propose a cease-fire. But Thompson was captured and held in the Galt House, a local hotel. Brown then dispatched his son, Watson, 24, and ex-cavalryman Aaron Stevens, also under a white flag, but the militiamen shot them down in the street. Watson, although fatally wounded, managed to crawl back to the engine house. Stevens, shot four times, was arrested. When the militia stormed the rifle works, the three men inside dashed for the shallow Shenandoah, hoping to wade across. Two of them—John Kagi, vice president of Brown's provisional government, and Lewis Leary, an African-American—were shot dead in the water. The black Oberlin student, John Copeland, reached a rock in the middle of the river, where he threw down his gun and surrendered. 
Twenty-year-old William Leeman slipped out of the engine house, hoping to make contact with the three men Brown had left as backup in Maryland. Leeman plunged into the Potomac and swam for his life. Trapped on an islet, he was shot dead as he tried to surrender. Throughout the afternoon, bystanders took potshots at his body. Through loopholes—small openings through which guns could be fired—that they had drilled in the engine house's thick doors, Brown's men tried to pick off their attackers, without much success. One of their shots, however, killed the town's mayor, Fontaine Beckham, enraging the local citizenry. "The anger at that moment was uncontrollable," says Frye. "A tornado of rage swept over them." A vengeful mob pushed its way into the Galt House, where William Thompson was being held prisoner. They dragged him onto the railroad trestle, shot him in the head as he begged for his life and tossed him over the railing into the Potomac. By nightfall, conditions inside the engine house had grown desperate. Brown's men had not eaten for more than 24 hours. Only four remained unwounded. The bloody corpses of slain raiders, including Brown's 20-year-old son, Oliver, lay at their feet. They knew there was no hope of escape. Eleven white hostages and two or three of their slaves were pressed against the back wall, utterly terrified. Two pumpers and hose carts were pushed against the doors, to brace against an assault expected at any moment. Yet if Brown felt defeated, he didn't show it. As his son Watson writhed in agony, Brown told him to die "as becomes a man." Soon perhaps a thousand men—many uniformed and disciplined, others drunk and brandishing weapons from shotguns to old muskets—would fill the narrow lanes of Harpers Ferry, surrounding Brown's tiny band. President James Buchanan had dispatched a company of Marines from Washington, under the command of one of the Army's most promising officers: Lt. Col. Robert E. Lee. Himself a slave owner, Lee had only disdain for abolitionists, who "he believed were exacerbating tensions by agitating among slaves and angering masters," says Elizabeth Brown Pryor, author of Reading the Man: A Portrait of Robert E. Lee Through His Private Letters. "He held that although slavery was regrettable, it was an institution sanctioned by God and as such would disappear only when God ordained it." Dressed in civilian clothes, Lee reached Harpers Ferry around midnight. He gathered the 90 Marines behind a nearby warehouse and worked out a plan of attack. In the predawn darkness, Lee's aide, a flamboyant young cavalry lieutenant, boldly approached the engine house, carrying a white flag. He was met at the door by Brown, who asked that he and his men be allowed to retreat across the river to Maryland, where they would free their hostages. The soldier promised only that the raiders would be protected from the mob and put on trial. "Well, lieutenant, I see we can't agree," replied Brown. The lieutenant stepped aside, and with his hand gave a prearranged signal to attack. Brown could have shot him dead—"just as easily as I could kill a musquito," he recalled later. Had he done so, the course of the Civil War might have been different. The lieutenant was J.E.B. Stuart, who would go on to serve brilliantly as Lee's cavalry commander. Lee first sent several men crawling below the loopholes, to smash the door with sledgehammers. When that failed, a larger party charged the weakened door, using a ladder as a battering ram, punching through on their second try. Lt. 
Israel Green squirmed through the hole to find himself beneath one of the pumpers. According to Frye, as Green emerged into the darkened room, one of the hostages pointed at Brown. The abolitionist turned just as Green lunged forward with his saber, striking Brown in the gut with what should have been a death blow. Brown fell, stunned but astonishingly unharmed: the sword had struck a buckle and bent itself double. With the sword's hilt, Green then hammered Brown's skull until he passed out. Although severely injured, Brown would survive. "History may be a matter of a quarter of an inch," says Frye. "If the blade had struck a quarter inch to the left or right, up or down, Brown would have been a corpse, and there would have been no story for him to tell, and there would have been no martyr." Meanwhile, the Marines poured through the breach. Brown's men were overwhelmed. One Marine impaled Indianan Jeremiah Anderson against a wall. Another bayoneted young Dauphin Thompson, where he lay under a fire engine. It was over in less than three minutes. Of the 19 men who strode into Harpers Ferry less than 36 hours before, five were now prisoners; ten had been killed or fatally injured. Four townspeople had also died; more than a dozen militiamen were wounded. Only two of Brown's men escaped the siege. Amid the commotion, Osborne Anderson and Albert Hazlett slipped out the back of the armory, climbed a wall and scuttled behind the embankment of the Baltimore and Ohio Railroad to the bank of the Potomac, where they found a boat and paddled to the Maryland shore. Hazlett and another of the men whom Brown had left behind to guard supplies were later captured in Pennsylvania and extradited to Virginia. Of the total, five members of the raiding party would eventually make their way to safety in the North or Canada. Brown and his captured men were charged with treason, first-degree murder and "conspiring with Negroes to produce insurrection." All of the charges carried the death penalty. The trial, held in Charles Town, Virginia, began on October 26; the verdict was guilty, and Brown was sentenced on November 2. Brown met his death stoically on the morning of December 2, 1859. He was led out of the Charles Town jail, where he had been held since his capture, and seated on a small wagon carrying a white pine coffin. He handed a note to one of his guards: "I John Brown am now quite certain that the crimes of this guilty land: will never be purged away; but with blood." Escorted by six companies of infantry, he was transported to a scaffold where, at 11:15, a sack was placed over his head and a rope fitted around his neck. Brown told his guard, "Don't keep me waiting longer than necessary. Be quick." These were his last words. Among the witnesses to his death were Robert E. Lee and two other men whose lives would be irrevocably changed by the events at Harpers Ferry. One was a Presbyterian professor from the Virginia Military Institute, Thomas J. Jackson, who would earn the nickname "Stonewall" less than two years later at the Battle of Bull Run. The other was a young actor with seductive eyes and curly hair, already a fanatical believer in Southern nationalism: John Wilkes Booth. The remaining convicted raiders would be hanged, one by one. Brown's death stirred blood in the North and the South for opposing reasons. "We shall be a thousand times more Anti-Slavery than we ever dared to think of being before," proclaimed the Newburyport (Massachusetts) Herald. 
"Some eighteen hundred years ago Christ was crucified," Henry David Thoreau opined in a speech in Concord on the day of Brown's execution, "This morning, perchance, Captain Brown was hung. These are the two ends of a chain which is not without its links. He is not Old Brown any longer; he is an angel of light." In 1861, Yankee soldiers would march to battle singing: "John Brown's body lies a-mouldering in the grave, but his soul goes marching on." On the other side of the Mason-Dixon line, "this was the South's Pearl Harbor, its ground zero," says Frye. "There was a heightened sense of paranoia, a fear of more abolitionist attacks—that more Browns were coming any day, at any moment. The South's greatest fear was slave insurrection. They all knew that if you held four million people in bondage, you're vulnerable to attack." Militias sprang up across the South. In town after town, units organized, armed and drilled. When war broke out in 1861, they would provide the Confederacy with tens of thousands of well-trained soldiers. "In effect, 18 months before Fort Sumter, the South was already declaring war against the North," says Frye. "Brown gave them the unifying momentum they needed, a common cause based on preserving the chains of slavery." Fergus M. Bordewich, a frequent contributor of articles on history, is profiled in the "From the Editor" column.
447c89b6059904d3bdeb292f850abfe7
https://www.smithsonianmag.com/history/john-m-barry-on-roger-williams-and-the-indians-9322792/
John M. Barry on Roger Williams and the Indians
John M. Barry on Roger Williams and the Indians John M. Barry is the author of New York Times bestsellers The Great Influenza: The Epic Story of the Deadliest Plague in History and Rising Tide: The Great Mississippi Flood of 1927 and How It Changed America. His most recent book, Roger Williams and the Creation of the American Soul, explores the relation between church and state and between the individual and the state through the story of Roger Williams' search for religious freedom and how it informed the society he founded in Rhode Island. Barry spoke to the magazine on Williams' respectful relationship with American Indians. Roger Williams said the Indians helped him survive in the wilderness after his banishment from the Massachusetts Bay Colony. How did he come into contact with Indians after he arrived in America? Williams had a great facility with language—a great curiosity for language—and began trading with Indians and trying to learn their language. He arrived first in Massachusetts and then went to Plymouth for a couple of years. He clearly traded with the Indians when he was in Plymouth, and when he went back to Massachusetts, he continued trading with them. He also negotiated between the English and the Indians as well as between Indian tribes, chiefly the Narragansett and the Wampanoag. He was easily the most fluent Englishman in America in the Algonquin language, the language used by New England Indians. Then in 1636, five years after he arrived, he was banished, so he had had five years of contact with the Indians. How did Williams' views on Indian land rights put him at odds with his fellow colonists? The colonists had two basic arguments for title to the land. First, the king gave it to them. Second, they argued that God had decided to give it to them by wiping out the Indian populations, probably with the smallpox epidemic. Since it was vacated, they felt it was theirs for the taking. Williams did not believe that. Running through Williams' veins was this idea that English common law controlled all legal relationships and guaranteed individual rights. He believed that Indians had the same property rights as Englishmen, and therefore just because the crown gave an Englishman land didn't mean it had any legal authority. As far as the vacancy argument, he pointed out that English noblemen owned vast estates and their only use of it was for hunting—same as the Indians. He felt the only legal claim to Indian land came when an Englishman bought the land from the Indians, so this was a threat to the English's legal title in the Bay Colony. Many people in Massachusetts had already bought some or all their land from the Indians, and after Williams started talking, many retroactively bought pretty much all the land they had. To make sure they had secure title, they tracked down Indians who could claim land they were occupying and paid them small amounts. That wasn't universal, but it was widespread. Despite Williams' banishment from Massachusetts, the Bay Colony asks him to persuade the Narragansett to side with the English in the Pequot War of 1637. Why does Williams oblige and how does he get the Narragansett to agree? There was a real threat to the very survival of the English in 1637 if the Pequot and the Narragansett joined forces in an alliance and attacked the English. Williams very much felt he was an Englishman despite having been banished. 
Also, he had a very close relationship with John Winthrop, who was then deputy governor of the Massachusetts Bay Colony and who had earlier warned Williams that he was about to be arrested, giving him the opportunity to flee. He had an equally strong relationship with Henry Vane, the governor at the time. Partly out of personal loyalty to Winthrop and Vane, partly out of loyalty to fellow countrymen, he acted. He risked his life when he walked into the camp where the Pequot and Narragansett were negotiating. As the only European in a camp of probably 1,000 or so warriors and several thousand more Indians, he proceeded to confront the Pequot, contradict them, and convince the Narragansett to remain neutral in the war. That certainly saved many English lives. It probably saved the colony itself, although even had the English been driven into the sea, they certainly would have returned. Williams' book A Key into the Language of America is more than just a dictionary, providing insights into Narragansett culture. What were some of his observations? He concluded that there were no real differences between Indians and Englishmen as men. There were only cultural and religious differences. He believed what he wrote: "Boast not proud English, of thy birth & blood, Thy brother Indian is by birth as Good. Of one blood God made him, and thee, & all." Williams also made anthropological observations, such as how Indians viewed borders; how they viewed property; that family kinship was extremely important—so much so that if an Indian was accused of murder and fled, the tribe might execute his brother instead; the way they prepared food; their lifestyle. All these things are described in the book. Why didn't Williams try to convert the Indians? He believed that to truly become a Christian you had to understand in depth what Christianity was and what the message of Christ was. He felt confident that he could have brought the tribes to a pro forma profession of Christianity, but that was not satisfactory to him. Williams felt that becoming a Christian had to come not simply from the heart, but from the heart and a full intellectual understanding. As fluent as he was in their language, he did not feel that he had enough fluency to really communicate that. As devout as he was, when Massachusetts Christians were putting intense pressure on the Narragansett to convert, threatening them with armed action if they did not, he actually convinced Cromwell's government to tell Massachusetts to back off, to guarantee that the Narragansett had the right to worship as they chose, which is really kind of extraordinary. In 1675, hostilities between the colonists and the Indians break out and again Williams mediates between the parties, but he's unsuccessful. Does King Philip's War change Williams' relationship with the Indians? The Indians burned Providence and burned Williams' own house down, which meant that he spent his last years in poverty. Nonetheless, right up to the very end of his life, he still considered Indians his friends. I think he saw the war not as this racial Armageddon but as bad policy, a terrible mistake. Certainly, Europeans had been on different sides in different conflicts and then formed alliances and friendships. He was well aware of that. I think he viewed it in that context.
ce6b9e49071eae7d6ce2d6921960c055
https://www.smithsonianmag.com/history/juanita-moody-woman-helped-avert-nuclear-war-180976993/
On the morning of Sunday, October 14, 1962, Juanita Moody exited the headquarters of the National Security Agency, at Fort Meade, Maryland, and walked the short distance to her car, parked in one of the front-row spaces reserved for top leadership. The sky was a crystalline blue, “a most beautiful day,” she recalled later. Moody had just learned that the U.S. Air Force was sending a U-2 spy plane over Cuba to take high-altitude photographs of military installations across the island. Moody was worried for the pilot—twice already in the past two years a U-2 spy plane had been shot out of the sky, once over the Soviet Union and once over China. She was also worried for the country. Tensions between the United States and the Soviet Union were worsening by the day. President John F. Kennedy, American military leaders and the intelligence community believed that the Soviet military was up to something in Cuba. Exactly what, no one could say. “I went out and got into my old convertible at the precise moment I had been told this pilot was going to get into his plane,” Moody said. What unfolded over the next two weeks was arguably the most dangerous period in the history of civilization. Close to 60 years later, the Cuban Missile Crisis is still considered a nearly catastrophic failure on the part of America’s national security apparatus. How America’s top agents, soldiers, diplomats, intelligence analysts and elected officials failed to anticipate and uncover the buildup of a nuclear arsenal on America’s doorstep, less than 100 miles off the coast, is still being studied and debated. At best, the story of American intelligence activities before and during the crisis is far from complete. One of the most extraordinary omissions to date is the central role played by Moody, a 38-year-old code-breaking whiz and the head of the NSA’s Cuba desk during the perilous fall of 1962. Even today her name is largely unknown outside the agency, and the details of her contributions to the nation’s security remain closely guarded. Of medium height, with lightly curled brown hair and a round face, Moody was not a spy in the secret agent sense. Her world was signals intelligence, or “sigint”—radio messages, radar data, electronic communications, weapons systems readings, shipping manifests and anything else that could be surreptitiously intercepted from friends and foes alike. Her only brief turn in the spotlight came more than a decade after the Cuban Missile Crisis, when she found herself caught up in the domestic surveillance scandals that engulfed Washington after Watergate. But who was this woman? I’ve spent several years trying to find out, digging through government archives and reviewing formerly classified documents, including internal NSA reports and performance reviews obtained using the Freedom of Information Act, as well as interviewing historians, current and former NSA staff and Moody’s surviving relatives, who provided personal letters and photographs. Now the story of this spy service pioneer and key figure in the nation’s response to Soviet encroachment in the Western Hemisphere can be told for the first time. * * * Juanita Moody (née Morris) was born on May 29, 1924, the first of nine children. Her father, Joseph, was a railroad worker turned cotton-and-soybean farmer, and her mother, Mary Elizabeth, a homemaker. The family lived in the hamlet of Morven, North Carolina, in a rented house with no bathroom, no electricity and no running water. Moody was a leader from an early age. 
“I felt I had to do what Juanita said,” her sister Virginia “Dare” Marsh, 90, told me on a call last spring. To her siblings, Juanita’s authority was on a par with that of their parents, yet her brothers and sisters didn’t resent her. “She was always sweet lovin’ and fair to me,” Marsh said. There was also a sense that Juanita was special. “I felt at times like my parents looked up to her as well.” The school superintendent in Morven saw a spark in her, too, and recommended her for Western Carolina Teachers College, in Cullowhee. Juanita borrowed money and enrolled, but then came the war. “All of the sudden there were practically no men left on the campus,” Moody recalled later, in one of a series of interviews with NSA historians that were declassified in 2016. “I felt that it was wrong to be spending my time in this beautiful place—clear blue skies, going around campus and studying and going to classes at leisure, when my country was in a war.” At the Army recruiting office in Charlotte, she said she wanted to volunteer. “What do you want to do?” the recruiter asked. “I’d like to get into intelligence work,” she said. It was spring 1943. Moody took a few tests and was sent to Arlington Hall, in Virginia, headquarters of the Signal Intelligence Service, the precursor to the NSA. She was trained quickly in what was known as “cryptanalysis,” and was soon part of a group that used ciphers to crack encrypted Nazi communications. When she finished work for the day, she and a few other obsessives stayed late into the night, working illicitly on an unsolved “one-time pad,” a code that could only be cracked with a key provided to the message’s recipient ahead of time. She recalled working “every waking moment” and subsisting on buns made by a sympathetic local baker who left them for her to pick up on her way home in the middle of the night. The painstaking nature of code breaking in those days, when teams of analysts sifted through piles of intercepted texts and tabulated and computed possible interpretations using pencil and paper, made a deep impression on Moody. Eventually, she and a colleague, a linguist and mathematician who had worked at Bletchley Park, Britain’s code-breaking headquarters, persuaded agency engineers to custom-build a machine for the one-time pad problem based on Alan Turing’s work that could generate cipher keys automatically, using the agents’ inputs. “It was a very clumsy thing,” Moody recalled. But it worked, helping the Americans decode secret messages sent to Berlin from the German ambassador in Tokyo. It was the first of many times in her long career that Moody, who would herself become a familiar face at Bletchley Park and at the IBM campus in New York, helped advance intelligence work by pushing for an ambitious and innovative use of new technologies. After Japan’s surrender, Moody told her superior at the SIS that, with the war done, she planned to return to college. Although he himself had earned a PhD, he told her that she was making a big mistake. “This is your cup of tea, and there are going to be other targets”—other secrets to uncover in defense of the nation. “This effort is not going to stop today. This is just the beginning.” * * * Moody stayed with the SIS, as a staff cryptanalyst focused on signals collection in Eastern Europe. In 1947, she was promoted to chief of the Yugoslavia section. 
Five years later, on October 24, 1952, President Harry Truman signed a secret memorandum, and the National Security Agency was born. Since the NSA’s inception, its role was unambiguous: snoop, scoop, filter, deliver. The agency’s responsibility ended at gathering information. Analysis was the purview of the brains at CIA. During the 1950s, Moody took on several new leadership roles at the NSA—chief of European satellites, chief of Russian manual systems, chief of Russian and East European high-grade manual systems. She also fretted over technical inefficiencies. At a time when computing technology was advancing quickly, she viewed the NSA’s use of handwritten decryptions, memos and top-secret communications as anachronistic. Where she excelled was not high-level mathematics or engineering but the application of new technologies to distill huge amounts of data and make it available to decision makers as quickly as possible. She was an advocate for using big data long before the concept had taken hold, and she pushed the agency to adopt the latest tools—Teletype, Flexowriter, early IBM computers, an intranet precursor and a searchable database called Solis. She managed whole teams of people—her “troops,” as she called them. As a leader, she was impolitic by her own measure, occasionally calling meetings to order by whacking a hockey stick on the table. She established a system she called “Show and Tell.” Each morning, while she sipped her coffee, the division heads under her command would come by her office one by one to present highlights from the previous day’s intelligence haul. Moody would then grill them about when the intercepts were made and when the information had been sent to the NSA’s “customers”—the White House, congressional leadership, military brass, the other intelligence agencies. When she judged the lag time to be substantial, she said so. “You people are doing a tremendous job producing beautiful history,” she’d tell them. “You’re not producing intelligence.” When it came to being a woman in a male-dominated world, Moody had a simple outlook. “I never had much of a problem,” she told an NSA historian in 2001. She credited the men in her family for bringing her up not to question her own worth. “They always made me feel that I could conquer the world if I wanted to,” she said. At the same time, she was convinced that on more than one occasion she had been passed over for a promotion because she was a woman. As the only woman present at NSA stag parties she was treated like a spectacle—one time the men had fed her with a spoon—yet she would only say, “That stood out a little bit.” She was also aware of harassment. One NSA director (Moody wouldn’t name him) employed several young women in the offices in Fort Meade, whom the director, believing himself to be witty, called NSA’s “paint and body shop.” Moody ran into three of these women one time in the restroom. Through tears, they described what they’d been subjected to, which Moody did not specify, but which appears to have been inappropriate sexual comments or behavior, perhaps even solicitation. Moody chose not to do or say anything. “Until this day,” she told the NSA interviewer, “I wish I had done something, you know—but I didn’t.” When she wasn’t working, Moody and her husband, Warren, an executive at Eastern Airlines, would escape the Beltway for the Shenandoah Valley, where they had a mountain cabin nicknamed Hoot ’n Holler. 
Life away from Washington was about cocktails, lawn games, music, tracking turkeys—anything but national security. Officials from Washington, friends from around the globe, military generals, even the occasional MI6 agent were guests. Moody’s favorite pastimes were listening to jazz, working in the garden, fishing, and hunting deer with a Ruger .44-caliber carbine. “She’d be singing Roger Miller songs and had a drink and was all happy,” Moody’s nephew William Peter Jacobsen III told me. In 1961, having been attached to the so-called “Soviet problem” for several years, Moody moved up again, becoming chief of a section known as G-Group, which was responsible for overseeing NSA’s operations nearly everywhere excluding China and the Soviet Union—some 120 countries. On the way home the night of her promotion, she stopped at a store and bought maps of Africa and South America. She wanted “to learn what all the countries were,” she recalled. * * * On April 17, 1961, paramilitary soldiers stormed Cuba’s Playa Girón, launching the brief and doomed attempt to overthrow Fidel Castro that became known as the Bay of Pigs. The surprise attack, carried out by Cuban exiles trained and led by the CIA, was in disarray almost from the start, and the blundering operation set in motion a rapid escalation between the United States and the Soviet Union that led directly to the Cuban Missile Crisis. Before the Bay of Pigs, Castro had been lukewarm about Soviet overtures and support. When the superpower next door tried to oust him, he changed his mind. For those in the American intelligence community, Soviet Premier Nikita Khrushchev’s vow to help the Cubans defend themselves made it imperative to focus more attention on the Caribbean, a new front in the Cold War. That spring, the NSA reorganized its operations, shifting resources to Cuba, which fell squarely under Moody’s command. “There might have been the equivalent of two people on the problem at that point,” Moody recalled. One of the first things her team detected was Cuba’s improved communication security, which had until then been “relatively unsophisticated,” as Moody put it. Now it was strengthened with the introduction of a microwave system across the whole island. The technology provided a high level of secrecy because land-based microwave antennas relay information in a chain, and the only way to intercept a message was to be close to an antenna. U.S. military and intelligence agencies knew about the towers but couldn’t intercept the signals being transmitted. The NSA responded by establishing new intercept facilities in Florida and flying surveillance aircraft around Cuba. But that wasn’t enough, so the Navy deployed the Oxford, the Liberty and the Belmont—World War II-era ships newly outfitted with surveillance equipment—which sailed along the edge of the island’s territorial waters. Over the next few months, Moody’s team discovered that the microwave towers were the least of America’s worries. Sigint revealed increased maritime traffic from Soviet naval bases to Cuba. Cargo manifests intercepted from Soviet ships docking in Cuba were sometimes blank. Other times, declared cargo didn’t match weights reported in port. Through intercepted conversations, the NSA learned of clandestine unloading at night, as well as the delivery of Soviet tanks. Things “were getting hotter and hotter,” Moody recalled. 
Around this same time, intercepted communications in Europe contained Spanish-language chatter at air bases in Czechoslovakia: The Soviets were training Cuban pilots. Also, the Americans learned, the USSR was sending MIG jets and IL-28 bombers to Cuba. Moody traveled to London at least once during this period, most likely to coordinate with her counterparts at Britain’s Government Communications Headquarters. By the fall of 1961, the Soviets had backed out of a bilateral moratorium on nuclear-weapons testing; in late October, they detonated a 50-megaton hydrogen bomb in the Arctic Sea, producing a blast equivalent to 3,800 Hiroshima bombs. A few weeks later, Louis Tordella, deputy director at the NSA, showed up at Moody’s office with two high-ranking officials from the Kennedy administration, one of whom was Edward Lansdale, an assistant secretary of defense. They stepped into a small conference room, where Tordella closed the door and drew the blinds. “We want to know what you know about Cuba,” Moody recalled Lansdale telling her. “Even if it’s a hunch, or a thought, or a guess, I want to know everything that’s on your mind when you think Cuba.” Moody started in on a highlight reel of intercepts—the blank cargo manifests, the bogus port declarations, conversations that mentioned tanks, radar and antiaircraft guns, the Soviet money and personnel flowing to the island. At one point, Lansdale interjected, “Now, come on!” as if Moody was exaggerating. She was unfazed. “I don’t have to have any hunches,” she said. It was all in the sigint. Impressed by her expertise, alarmed by what she had to say, and perhaps concerned that no one was providing the White House with this level of detail about an aggressive military buildup in Cuba, Lansdale asked Moody to write up her findings. Along with a few colleagues, she spent the next three days and nights compiling “wheelbarrow loads of material” into what she called “a special little summary for the assistant secretary of defense.” When she was done, Moody urged Tordella to “publish” her report, meaning circulate it among the intelligence agencies, the White House, the State Department and the military. Cautious not to step outside NSA’s prescribed role, Tordella rebuffed her, but he did send it to Lansdale, who sent it to President Kennedy, who returned it with his initials—signaling he’d read it. “I told my troops, ‘Keep this updated,’” Moody said of her report. “‘If you get anything to add to it, do it immediately and tell me.’” Over the next few months, Moody repeatedly, and unsuccessfully, pleaded with Tordella to release her updated report. By early 1962, she said she was “really getting scared.” The amount of military equipment piling up in Cuba didn’t square with the Soviets’ repeated assertions that it was all “defensive.” Details about Soviet technicians “moving around in Cuba” were especially worrisome, and by this point the NSA likely knew the Soviets had moved surface-to-air missiles (not to be confused with ballistic nuclear missiles) to Cuba as well. In February, not long after the NSA learned that a general from the USSR’s Strategic Rocket Forces arrived in Cuba, Moody went to Tordella once more. “Look, let’s publish this,” she said. “We can’t do that,” Tordella replied. “It will get us in trouble, because it would be considered outside of our charter.” It was the same rationale he’d been giving since November. Moody persisted. 
“It has reached the point,” she told him, “that I am more worried about the trouble we’re going to get in having not published it, because someday we’re going to have to answer for this. And if we do....” Tordella relented. It was the first such NSA report distributed to the wider intelligence community, and it quickly made the rounds. Before long, an old CIA friend of Moody’s showed up at her office. He wanted to congratulate her, he said. “Everybody knows that you were responsible for getting that serialized report on what’s happening in Cuba out, and I want you to know that was a good thing you did,” she recalled him saying. But he also warned her that not everyone was thrilled about her initiative; he had just come from a high-level meeting at the CIA during which officials tried to “decide what to do about NSA for overstepping their bounds.” Even today, in spite of the fact that so much about the Cuban Missile Crisis has been made public, Moody’s groundbreaking report, dated February 1962, remains classified. Nevertheless, it’s possible to track the crucial impact it had on American decision-making as the Cuba situation pushed closer to disaster. By springtime, it was clear that the Cubans had established an air defense system similar to one in the Soviet Union and manned, at least in part, by native Russian speakers. In a little over a month, the NSA and its partners had tracked 57 shipments of personnel and military equipment from the USSR to Cuba. MIG fighter jets were soon buzzing U.S. naval aircraft venturing near the island. The CIA, meanwhile, was hearing from spies and double agents about missiles, but what kind of missiles was still unknown. In an August 22 meeting, CIA Director John McCone updated President Kennedy about Soviet ships that had recently delivered thousands of Russian troops plus “substantial quantities of military materiel as well as special electronic equipment, many large cases, which might contain fuselages for fighter airplanes or it might contain missile parts, we do not know.” What he did know came, at least in part, from sigint reports by Moody and her team. This was two months before the apex of the crisis. If anyone was worrying about the possible presence of nuclear missiles specifically, they didn’t say so. But McCone was closest to guessing the nature of the threat. The CIA director grew convinced that the Soviets had placed surface-to-air missiles on the island to keep prying eyes away. His deputy at the time later recalled McCone telling his team: “They’re preventing intrusion to protect something. Now what the hell is it?” The Americans stopped conducting U-2 reconnaissance flights over Cuba in early September out of concern that the planes might be shot down. Later that month, armed with intelligence from Moody’s G-Group and information from sources on the ground, McCone persuaded the president and the National Security Council to restart U-2 flyover missions to get answers. Poor weather and bureaucratic holdups delayed the first mission. Finally, on Sunday, October 14, after a so-called “photo gap” of more than five weeks, a U-2 spy plane took off from California’s Edwards Air Force Base for the five-hour flight to Cuba. That same morning, Moody sat in her convertible at Fort Meade, staring at the sky. * * * Because of the danger, the pilot spent only a few short minutes in Cuban airspace before landing in Florida. 
The next day, a group of intelligence experts huddled over tables in the Steuart Building in downtown Washington, D.C., the secret headquarters of the CIA’s National Photographic Interpretation Center, to pore over 928 images that the U-2 had taken of several military sites. Examining one set of photographs, an analyst named Vince Direnzo paused when he saw what appeared to be six unusually long objects obscured by a covering, possibly canvas. He determined that these objects were much larger than Soviet surface-to-air missiles the Americans already knew were in Cuba. Direnzo checked photographs of the same site taken during flyover missions weeks earlier and saw that the objects had been placed there in the intervening time. In the archives he compared the images with photographs of May Day celebrations in Moscow, when the Soviets paraded military equipment through Red Square. He became convinced that the objects spotted in Cuba were SS-4 medium-range ballistic missiles, weapons that could carry nuclear payloads and had a range of more than 1,200 miles—capable of striking a large portion of the continental United States. Further photographic evidence from other sites revealed missiles with a range of 2,400 miles. Direnzo and his colleagues spent hours checking and rechecking their measurements and looking for ways they might be wrong. When they shared their assessment with the center’s director, he concurred, adding that this was most likely “the biggest story of our time.” The findings were soon verified by a Soviet colonel secretly working for MI6 and the CIA. Faced suddenly with an unprecedented threat, Kennedy ordered a maritime “quarantine” of Cuba, to block any further transport of weapons to the island, and declared that noncompliance by the Soviet Union would mean war. The hope was the line-in-the-sea strategy would demonstrate force and readiness to attack while providing both sides with breathing room, so they could begin inching away from the ledge. With the discovery of nuclear weapons in Cuba, the mission at the NSA shifted abruptly from uncovering secrets to assessing the enemy’s war footing in real time or as close to it as possible. Gordon Blake, the NSA director, established an around-the-clock team to churn out sigint summaries twice a day as well as immediate updates as needed. Moody was put in charge of this effort; she spent many nights sleeping on a cot in her office. She later recalled the solidarity throughout the agency, with staff members from other groups showing up at Moody’s office to volunteer their help. Late one night, Blake himself stopped by and asked how he could lend a hand. Moody gave him a list of names. Blake picked up the phone, and Moody overheard him rousing people from their sleep: “This is Gordon Blake. I’m calling for Juanita Moody. She wonders if you can come in. They need you.” Listening and watching for new activity on and near the island, sigint collectors relied on land-based electronic surveillance, a “net” of underwater hydrophones, spy planes, listening devices on Navy ships, and other, still-classified tools. The USS Oxford continued its near-shore mission, despite being well within range of a Soviet attack. It wasn’t long before sigint indicated that radar systems at the newly discovered missile sites had been activated. Of paramount concern was figuring out how Soviet ships would respond to the quarantine. 
Using intercepted radio and radar information, maritime traffic analyses and location data provided by the Navy, Moody’s team kept close tabs on Soviet ships and nuclear-armed submarines as they made their way from the North Atlantic toward Cuba. One critical intercepted correspondence, from the Soviet naval station at Odessa, informed all Soviet ships that their orders would now come directly from Moscow. But whether this meant Moscow was planning a coordinated challenge to the blockade, or a standdown, no one knew. Then, on October 24, two days after Kennedy announced the quarantine, there was a glimmer of hope: Sigint confirmed that at least one Soviet ship headed toward Cuba had stopped and changed direction, and appeared to be rerouting back toward the Soviet Union—a sign the Soviets weren’t intending to challenge Kennedy’s quarantine. Yet it was also crucial that American officials feel confident in that assessment. This close to the ledge, there was simply no room for miscalculation. Nobody understood that better than Moody. Although the intelligence about the ship redirecting its course came in the middle of the night, Moody felt the higher-ups needed to know about it right away. She made an urgent call to Adlai Stevenson, the U.S. ambassador to the United Nations, who was slated to address the Security Council about the crisis the following day. When State Department officials refused to put her through, she dialed the number for his hotel room directly. “I called New York and got him out of bed,” she recalled. “I did what I felt was right, and I really didn’t care about the politics.” (She also noted that later “he sent up congratulations to the agency.”) The intelligence provided the first positive signs of a peaceful exit from the standoff, but it was hardly over. At one point, Navy destroyers and the aircraft carrier USS Randolph tried to force a nuclear-armed Soviet submarine just outside the quarantine zone to the surface by detonating underwater explosives, nearly provoking all-out war. Then, on October 27, the Soviets shot down a U-2 plane over Cuba, killing Air Force pilot Rudolf Anderson Jr. In Washington, the plan had been to strike back in the event that a U-2 was downed, but Kennedy ultimately decided to refrain. Finally, on the morning of October 28, after the United States secretly offered to remove its nuclear missile bases in Turkey and Italy, Khrushchev agreed to dismantle the missile sites in Cuba. A few weeks later, in a letter of thanks addressed to the NSA director, the commander of the U.S. Atlantic Fleet, Adm. Robert Dennison, wrote that the intelligence coming from NSA’s Cuba desk was “one of the most important single factors in supporting our operations and improving our readiness.” Moody’s use during the crisis of what were known as “electrograms,” essentially top-secret intelligence reports sent to the highest levels via Teletype, forever reshaped how the agency handled urgent intelligence, according to David Hatch, the senior NSA historian. “Juanita was a pioneer in using this capability,” he told me. Before Moody’s innovation, he went on, “most product was released via slower means, even in a crisis—hand-carried by courier, by interoffice mail, or even snail mail, to cite a few examples. The importance of having the ability to disseminate sigint in near-real-time was clearly demonstrated” during the Cuban Missile Crisis. “The information Juanita and her team produced was very important in the decision to launch U-2s,” Hatch said. 
The United States would not have learned what it did, when it did, about offensive nuclear weapons in Cuba without Moody, a civilian woman in a male and military-dominated agency. Moody would later say the work she did in the 1940s and ’50s had prepared her for the Cuba standoff. “I felt at the time, while it was happening, that somehow I had spent all of my career getting ready for that crisis,” she said of those tense weeks in the autumn of 1962. “Somehow, everything that I had done had helped point me to be in the best position possible, knowledge-wise, to know how to proceed in that crisis.” * * * Moody would go on to lead management training courses within the agency, and she helped establish a permanent position for an NSA liaison in the White House Situation Room. The deaths of U-2 pilots had troubled her deeply, and she worked to improve the system for warning pilots when enemy aircraft made threatening course corrections. And she continued to work closely with IBM engineers to improve the NSA’s technical capabilities. Within the agency, she reached legendary status. One of her Fort Meade colleagues told me that a gaggle of young staffers, nearly all of them men, could frequently be seen trailing Moody down the halls, scribbling notes while she spoke. In 1971, Moody received the Federal Woman’s Award, established to honor “leadership, judgment, integrity, and dedication” among female government employees. During the Cuba “emergency,” Moody’s citation noted, “when the provision of intelligence to the highest authorities was of utmost importance, Ms. Moody displayed extraordinary executive talent.” In his nomination letter, Tordella, the deputy NSA director, whom Moody had clashed with about the Cuba report, called her “brilliant,” and wrote that “no one in a position to know can but affirm that so far as this Agency contributed to the successful U.S. effort in a critical period, Mrs. Moody must be given credit for a significant share in that success.” At the banquet dinner, Moody, dressed in a pink gown, sat next to Henry Kissinger, then the U.S. national security adviser. She brought her parents from North Carolina, as well as her sister Dare. Afterward, congratulatory letters and cables came from the White House, the British Embassy, the U.S. Mission in Vietnam, the CIA, the Navy. Yet the broader American public, at that point unaware even of the existence of the National Security Agency, had no idea who she was. That changed in 1975, when a bipartisan congressional investigation launched in the wake of Watergate found that the NSA had intercepted conversations that included U.S. citizens. More than that, the NSA was supporting federal agencies, namely the CIA, FBI and Secret Service, in their efforts to surveil American citizens put on secret watch lists. An outcry ensued. The maelstrom would cause lasting damage to the American people’s perception of the trustworthiness of the country’s national security apparatus. Moody, as the liaison between the NSA and other federal agencies—memos to the NSA from FBI Director J. Edgar Hoover were addressed “Attention: Mrs. Juanita M. Moody”—was caught in the middle. In September 1975, NSA Director Lew Allen Jr. sent Moody to Capitol Hill to testify in hearings about the agency’s surveillance. She had never been trained to testify or speak to a general audience about NSA work, but she accepted the assignment without protest. Frank Church, the Idaho senator who chaired the committee investigating abuses of power by U.S. 
intelligence agencies, told Moody that she would have to testify in an open and televised session. Moody refused. “I took an oath to protect classified information and never to reveal it to those who are not authorized and have the need to know,” she told him. “I don’t know of any law that would require me to take an oath to break an oath. Is there such a thing, Senator?” There was not, and it was closed sessions for her week on Capitol Hill. At one point, Senator Walter Mondale, of Minnesota, demanded that Moody bring “everything” NSA had—meaning all the material gathered that might relate to American citizens. Practically speaking, it was an absurd demand; NSA was already collecting enormous amounts of information, much of it superfluous. Very little of it would be of value to the committee’s investigation. Moody tried to explain to Mondale that he misunderstood the nature of the information he was requesting, but he cut her off. “I don’t give a good goddamn about you and your computers, Mrs. Moody,” Mondale barked. “You just bring the material in here tomorrow.” The next day a truck dumped hundreds of pounds of paper at Mondale’s office. Mondale, having learned in a hurry how ill-informed his request had been, tried to make nice with Moody the next time they met. Putting his hand on her shoulder, he thanked her for being so cooperative. “I wasn’t too pleased or happy about that,” she said later, referring to Mondale’s hand on her shoulder, his change in tone, or both. During her testimony, Moody explained that lists of names were given to her group at the NSA. When the names appeared in their intercepts, NSA flagged it. She maintained to the last that the NSA had never done anything wrong. “We never targeted Americans,” she told an NSA interviewer in 2003. “We targeted foreign communications.” NSA’s own tribute to Moody in the agency’s “Hall of Honor” says the congressional hearings “incorrectly identified [her] with some possible abuses of government power.” Still, Moody kept cool throughout the hearings. She even savored the opportunity to teach committee members about the sigint process. She considered it “a great privilege” to help educate the men down on Capitol Hill. “It was the only thing I enjoyed down there,” she said. Two months later, in February 1976, Juanita Moody retired. If she was ever upset about the way she had been treated during the wiretapping scandal, she kept it to herself. She and Warren made frequent trips to Hoot ’n Holler, their Shenandoah getaway, and to North Carolina, where Moody’s parents and many siblings still lived. “All the years I was working, my sisters and brothers were the ones who took care of my parents,” she told a friend. “Now it’s my turn.” After Warren became ill, in the 1980s, the Moodys relocated to a seaside town in South Carolina. When not caring for her husband, Juanita planned renovations and real estate ventures and hunted antiques and secondhand jewelry. “She was a delightful lady,” Fred Nasseri, a former Iranian diplomat who moved to the U.S. after the Iranian Revolution, told me recently. Nasseri had opened a Persian rug business in nearby Litchfield, and he and Moody became friends. “We would discuss art, politics, diplomacy.” But even in retirement Moody, who died in 2015, at age 90, and was buried at Arlington National Cemetery, was discreet. When asked about her past, she would deflect. 
As one friend remembered her saying, “Oh, I’ve done lots of interesting things for a country girl from North Carolina.” This story was produced in partnership with Atellan Media.
8e1ae3d4a998cae07eec7f836f4dfc12
https://www.smithsonianmag.com/history/kielce-post-holocaust-pogrom-poland-still-fighting-over-180967681/
Kielce: The Post-Holocaust Pogrom That Poland Is Still Fighting Over
Kielce: The Post-Holocaust Pogrom That Poland Is Still Fighting Over The massacre started with a blood libel. That wouldn’t be unusual, except this wasn’t the Middle Ages or even Nazi Germany—it was 1946, a year after the end of World War II. A few days earlier, an 8-year-old Polish boy named Henryk Błaszczyk had gone missing from his home in Kielce, a city of 50,000 in southeastern Poland. When Henryk reappeared two days later, he told his family he had been held by a man in a basement. As his father walked him to the police station to recount his story, the boy pointed at a man who was walking near the large corner building at 7 Planty Street. He did it, Henryk said. The building, which was owned by the Jewish Committee and housed many Jewish institutions, was home to up to 180 Jews. It did not have a basement. Most of the residents were refugees, having survived the horrors of the death camps that decimated more than 90 percent of the Polish Jewish population. After the war, they had returned to their homeland with the hope that they could leave the past behind them. They had no idea they were about to become the target of anti-Semitic aggression once again—this time from the Polish neighbors they lived alongside. On the morning of July 4, a small group of state militia and local police approached the building to investigate the alleged kidnapping. As rumors of misdeeds spread, reviving the centuries-old “blood libel” that Jews were kidnapping Christian children for ritual sacrifice, a mob began to assemble. But it was the police and military who started the violence, recounts Polish historian Jan T. Gross in his 2006 book Fear: Anti-Semitism in Poland After Auschwitz. Though they were ostensibly there to protect civilians and keep the peace, officers instead opened fire and began dragging Jews into the courtyard, where the townspeople savagely attacked the Jewish residents. That day, Jewish men and women were stoned, robbed, beaten with rifles, stabbed with bayonets, and hurled into a river that flowed nearby. Yet while other Kielce residents walked by, none did anything to stop it. It wasn’t until noon that another group of soldiers was sent in to break up the crowd and evacuate the wounded and dead. In the afternoon, a group of metal workers ran toward the building, armed with iron bars and other weapons. The residents of 7 Planty were relieved; they thought these men had come to help. Instead, the metal workers began brutally attacking and killing those still alive inside the building. The violence went on for hours. As Miriam Guterman, one of the last remaining survivors of the pogrom, put it in the 2016 documentary film Bogdan’s Journey: “I couldn’t believe that these were humans.” (Guterman died in 2014.) All told, 42 Jews were killed that day at 7 Planty and around the city, including a newborn baby and a woman who was six months pregnant. Another 40 were injured. Yet beyond the horror of those physical facts, the event would take on a larger historical significance. After the Holocaust, many Jews had dreamed of returning to their native lands. Kielce shattered that dream; for Jews, Poland could never again be home. “[Kielce] really is a symbol of the exodus of Jewish survivors from Poland, and a symbol sometimes that there is no future in Poland for Jews,” says Joanna Sliwa, a historian with the Conference on Jewish Material Claims Against Germany who focuses on modern Polish Jewish history and the Holocaust. 
“That despite what Jews had endured during the Holocaust, and despite the fact that the local Polish population had observed all that, had witnessed all of that … Jews cannot feel safe in Poland.” Sliwa points out that Kielce was not the first post-war pogrom against Jews in Poland; smaller outbursts of violence took place the previous year in Krakow and the town of Rzeszow. In the years that followed, the Kielce pogrom—like so many atrocities committed or abetted by Poles during the war—became taboo. There were no memorials. When Bogdan Bialek, a Catholic Pole from Białystok, moved to Kielce in 1970, he sensed immediately that something was wrong. In Bogdan’s Journey, which was recently screened at an event at the Paley Center for Media in New York organized by the Claims Conference, Bialek remembers sensing a deep guilt or shame among residents when it came to talking about the pogrom. He calls this oppression of silence a “disease.” Bialek became drawn to the abscess—what Jewish historian Michael Birnbaum referred to at the event as “the looming presence of absence”—that seemed to be haunting the town. Over the past 30 years, he made it his mission to bring this memory back to life and engage today’s residents of Kielce in dialogue through town meetings, memorials and conversations with survivors. Unsurprisingly, he encountered pushback. The story of the Kielce massacre—which the film pieces together using the testimony of some of the last living victims and their descendants—is inconvenient. It challenges Poles. It opens old wounds. But for Bialek, bringing dialogue to this moment isn’t just about reopening old wounds—it is about lancing a boil. “Each of us has a tough moment in his past,” he says in the film, which was funded in part by the Claims Conference. “Either we were harmed, or we harmed someone. Until we name it, we drag the past behind us.” Since the collapse of communism in 1989, Poland has gone through a soul-searching process that has progressed in bursts, with moments of clarity but also worrisome backsliding. Polish Jews have come out of the shadows, establishing new communities and reincorporating Jews back into the country’s fabric. In the mid-2000s, reports began to emerge documenting a curious trend: a “Jewish revival” of sorts sweeping Poland and beyond. Polish Jews reclaimed their roots; Polish-Jewish book publishers and museums sprung up; once-decimated Jewish quarters began to thrive again. Part of that shift has been a reexamination of Poland’s history, Bialek said in an interview with Smithsonian.com. “We began with no understanding at all, with a kind of denial, and over time it’s been changing,” Bialek said in Polish, translated by Michał Jaskulski, one of the film’s directors. “These days it’s also easier for [Poles] to see from the perspective of the victims, which didn’t happen before. And we truly can notice how the pogrom strongly impacted Polish-Jewish relations.” But there is still work to be done, he readily admits. While Poles today don’t deny that the pogrom actually happened, they do debate who deserves responsibility for the atrocity. Conspiracy theories ran rampant when Bialek first moved to Kielce, and he reports that they are still common today. In the film, co-director Larry Loewinger interviews several older residents who claim that the riot was instigated by Soviet intelligence, or even that Jews themselves staged a massacre by dragging bodies to the scene. 
Unlike the better-known massacre at Jedwabne, when Poles living under Nazi control herded several hundred of their Jewish neighbors into a barn—and burned them alive—the tragedy in Kielce was born of post-war tensions. Poland was on the brink of civil war, its citizens were impoverished, and at the time many believed Jews were communists or spies. “You have to understand, Poland was a pretty miserable place in 1946,” says Loewinger. “It was poverty stricken. There were Jews floating around … There was a lot of anger all over.” Yet there are clear parallels. Jedwabne happened in 1941, directly after the Nazi conquest of Poland; the accepted narrative is that the killing was carried out by Poles under pressure from Nazi Germans. In Kielce, the Polish people are equally “blameless.” Both of these narratives allow Poles to cling to a national mythology of victimhood and heroism. As Polish journalist and dissident Konstanty Gebert wrote in Moment, “Raised for generations with the (legitimate) belief that theirs was a martyred nation, many Poles found it increasingly hard to accept that their victimhood did not automatically grant them the moral high ground when it came to their behavior toward Jews during the Holocaust.” Moreover, says Sliwa, “Both of these events show how dangerous these conspiracy theories are, and how these myths about the so-called other, the blood libel, and … equating Jews with Communism, can turn into mob-like violence.” In a 2016 television interview, Poland’s education minister Anna Zalewska appeared to deny Polish responsibility for any involvement in both of these historical events. When asked directly, “Who murdered Kielce’s Jews during the town pogrom?” she was unable to answer the question. She demurred, before finally answering: “Anti-Semites.” She did not admit that these anti-Semites were Poles. When controversy erupted, Zalewska received support from Foreign Minister Witold Waszczykowski, who said her comments had been “misunderstood.” “It has to do with the Polish government, the effort to in a way rewrite history,” says Sliwa. “To put more emphasis on heroism and patriotism of the Polish nation during the war and after the war. It seems like it is an attempt to take hold over, to control, how the past is narrated.” The concern that Poland is rewriting its history feels more relevant now than ever. Ever since the 2015 victory of the Law and Justice (Prawo i Sprawiedliwość) party, the right-wing populist party led by Jarosław Kaczyński, the government has pursued what is openly referred to as polityka historyczna, or “history policy.” Journalists and historians like Sliwa, however, call it “politicized history.” Of course, she adds, “there was discussion about this even before Law and Justice came to rule Poland. But now that taken over, it’s become so public and acceptable. And official, really official.” You can see traces of this “history policy” in how the Kielce story has evolved over time. Despite the facts Gross and others have detailed, a 2004 report by the Institute of National Remembrance (IPN)—a state research institute that examines crimes committed by the Nazi and communist regimes and routinely minimizes Poland’s role in the Holocaust—concluded that the Kielce pogrom was the result of a “mishap.” This year, the Polish government backed legislation that would criminalize the use of the phrase “Polish death camps,” stating that the phrase wrongly implicated Poles as the orchestrators of Auschwitz and other Nazi death camps. 
At the same time, Poland’s far right groups have grown emboldened. The largest demonstration of anti-immigrant and fascist attitudes coalesced in November of last year, on the country’s official Independence Day. The celebration, which has become an annual rallying point for Poland’s far-right groups, saw more than 60,000 demonstrators march through Warsaw calling for “White Europe.” Some threw red smoke bombs or carried banners with white supremacist symbols or phrases like “Clean blood.” Others chanted “Pure Poland, white Poland!” and “Refugees get out!” The ruling party has long stoked fear of Muslim refugees, with Kaczyński saying in 2015 that migrants brought “dangerous diseases” including “all sorts of parasites and protozoa.” In 2017, Poland refused to take in refugees despite the European Union's threats to sue. Poland has also seen an upswing in racially motivated violence toward foreigners, with Muslims and Africans the most frequent targets of attacks. In 2016, Polish police investigated 1,631 hate crimes fueled by racism, anti-Semitism or xenophobia. To Bialek, these attitudes are a scary echo of what happened in 1946, and 1945. Worse, he fears they are a harbinger of things to come. “I keep on saying that for the last couple of years that these things may come back,” says Bialek. “When there are these examples of hostility of people in Poland toward foreigners, because they speak in different language, because they have darker skin, when these things happen—to me the most terrifying thing is the indifference. It is to have people who see these things do nothing about it.” He continues: “When you’re referring to this ‘Independence’ march, the authorities would say that people who carry these wrong texts on their banners were a minority. Even if this was true, no one did anything about it. The authorities allow these things.” With Bogdan’s Journey, the filmmakers strive to keep the memory of another time the authorities did nothing—and in fact aided in an atrocity—fresh in Poles’ minds. The film premiered in summer 2016 at the POLIN Museum of the History of Polish Jews in Warsaw; last month it began screening nationally for the first time. While it has been generating positive interest in Polish media, there have also been accusations online that resurface the Soviet conspiracy theories and claim the film is deliberately misleading. The film anticipates just such a response. “The disgrace of the pogrom will never disappear. It is a historical fact,” Bialek says in it. He only hopes that, “With time, the world will remember not only the pogrom in Kielce, but also that Kielce has tried to do something about it." Rachel is the Science Editor, covering stories behind new discoveries and the debates that shape our understanding of the world. Before coming to Smithsonian, she covered science for Slate, Wired, and The New York Times.
cd075885229167846836f8de551882a7
https://www.smithsonianmag.com/history/kon-tiki-sails-again-5404357/
Kon-Tiki Sails Again
Kon-Tiki Sails Again The most harrowing scene in Kon-Tiki, the new Oscar-nominated Norwegian film about the greatest sea voyage of modern times, turns out to be a fish story. In the 2012 reconstruction of this 1947 adventure, six amateur Scandinavian sailors—five of whom are tall, slim and valiant—build a replica of an ancient pre-Incan raft, christen it Kon-Tiki and sail westward from Peru along the Humboldt Current for French Polynesia, more than 3,700 nautical miles away. In mid-passage, their pet macaw is blown overboard and gobbled up by a big bad shark. During the scene in question, one of the tall and slim and valiant is so enraged by the bird’s death that he thrusts his bare hands into the Pacific, hauls in the shark and guts it with a savagery that would have made Norman Bates envious. The shark’s blood seeps through the balsa timbers of the Kon-Tiki, inciting a feeding frenzy down below. Meanwhile, the sixth crewmate—this one short, plump and craven—slips off the edge of the raft, which can neither stop nor turn back. As it drifts away from the drowning fat man, his slim companions frantically distract the crazed sharks with chunks of flesh. Then one seaman plunges to the rescue bearing a life belt secured to the raft by a long line. After several stomach-churning seconds Skinny reaches Fatty, and the others yank them in before they become Shark Bites. It hardly matters that there never was a fat guy or a vengeful seaman, and that the munched macaw was really a parrot that vanished without drama into salt air. Like Lincoln, the film takes factual liberties and manufactures suspense. Like Zero Dark Thirty, it compresses a complex history into a cinematic narrative, intruding on reality and overtaking it. The irony is that the epic exploits of the Kon-Tiki’s crew had once seemed untoppable. From the get-go, anthropologist Thor Heyerdahl, the expedition’s charismatic and single-minded leader, had touted the voyage as the ultimate test of nerve and endurance. His daring travel adventure sparked a spontaneous media circus that turned him into a national hero and a global celebrity. In Heyerdahl’s 1950 Kon-Tiki, Across the Pacific by Raft—a lively chronicle that sold more than 50 million copies and was translated into nearly 70 languages—and his 1950 Academy Award-winning documentary Kon-Tiki, the sailors were presented as 20th-century Vikings who had conquered the vast, lonely Pacific. The new movie elevates them from Vikings to Norse gods. “Thor had a special feeling of greatness about him,” says Jeremy Thomas, one of the film’s producers. “He was more than merely brave and courageous: He was mythic.” Kon-Tiki is a gloss on a man whose towering self-regard allowed him to ignore critics who insisted he was on a suicide mission. Was the voyage a genuine scientific breakthrough or a rich kid’s diversion? By making Heyerdahl mythic and sidestepping the shifting layers of truth in his feats and scholarship, the filmmakers beg a reappraisal of his perch in popular consciousness. *** The myth of the Kon-Tiki begins during the late 1930s on the South Pacific island of Fatu Hiva, in the Marquesas chain. It was there that Heyerdahl and his new bride, Liv, took a yearlong honeymoon to research the origins of Polynesian animal life. While lying on a beach, gazing toward America, the University of Oslo-trained zoologist listened to a village elder recite the legends of his ancestors, towheaded men who arrived with the sun from the east. 
Their original home was high in the clouds. Their chieftain’s name was Tiki. To Heyerdahl, the people described by the village elder sounded a lot like the fair-skinned Peruvians who were said in oral tradition to have lived by Lake Titicaca before the Incans. Ruled by the high priest and sun king Con-Tiki, they built temples with huge stone slabs quarried on an opposite shore and ferried across the water on balsa rafts. Supposedly, a turf war had wiped out most of the white race. Con-Tiki and a few companions escaped down the coast, eventually rafting westward across the ocean. Heyerdahl hypothesized that Tiki and Kon-Tiki were one and the same, and the source of Pacific cultures was not Asia, as orthodox scholars held, but South America. It was no mere coincidence, he said, that the huge stone figures of Tiki on this Polynesian island resembled the monoliths left by pre-Incan civilizations. His radical conclusion: The original inhabitants of Polynesia had crossed the Pacific on rafts, 900 years before Columbus traversed the Atlantic. The scientific community dismissed Heyerdahl’s findings. Fellow academics claimed humans could never have survived the months of exposure and privations, and that no early American craft could have weathered the violence of the Pacific’s storms. When Heyerdahl failed to interest New York publishers in his manuscript, the evocatively titled “Polynesia and America: A Study of Prehistoric Relations,” he decided to test his theories of human migration by attempting the journey himself. He vowed that if he pulled it off, he’d write a popular book. Heyerdahl’s father, the president of a brewery and a mineral water plant, wanted to bankroll the expedition. But his plans were scuttled by restrictions on sending Norwegian kroner out of the country. So the younger Heyerdahl used his considerable powers of persuasion to scrounge the money ($22,500). He then put out a call for crew members: “Am going to cross the Pacific on a wooden raft to support a theory that the South Sea islands were peopled from Peru. Will you come? Reply at once.’’ Four Norwegians and a Swede were game. Though the recruits knew Heyerdahl, they didn’t know one another. Most were intimate with danger as members of Norway’s wartime underground. They had either been spies or saboteurs; Heyerdahl himself had served as a paratrooper behind Nazi lines. Curiously, he could barely swim. Having twice almost drowned as a boy, he had grown up terrified of water. Heyerdahl and countryman Herman Watzinger flew to Lima and, during the rainy season, crossed the Andes in a jeep. In the Ecuadorean jungle, they felled nine balsa trees and floated them downriver to the sea. Using ancient specs gleaned from explorers’ diaries and records, the crew patiently assembled a raft in the naval harbor of Callao. The Kon-Tiki ran against every canon of modern seamanship. Its base—made of balsa logs ranging in length from 30 to 45 feet—was lashed to crossbeams with strips of hand-woven Manila rope. On top was laid a deck of bamboo matting. The raft’s small half-open cabin of bamboo plaits and leathery banana leaves was too low to stand in. A bipod mast was carved of mangrove, hard as iron. The square sail, bearing a likeness of the sun god, was set on a yard of bamboo stems, bound together; the helm was a 15-foot-long mango wood steering oar. For verisimilitude, this weird vegetable vessel was constructed without spikes, nails or wire—all of which were unknown to pre-Columbian Peruvians. 
Though ignorant of the Incan art of steering, Heyerdahl was well aware of the perils awaiting an open raft with no more stability than a cork. (Balsa is, in fact, less dense than cork.) Skeptics—including National Geographic magazine, which declined to sponsor the expedition—treated Heyerdahl like he was on a dice roll with death. So-called experts predicted that the balsa would quickly break under the strain; that the logs would wear through the ropes or get waterlogged and sink; that the sail and rigging would be stripped by sudden, screaming winds; that gales would swamp the raft and wash the crew overboard. A naval attaché bet all the whiskey the crew members could drink over the rest of their lives that they’d never make it to the South Seas alive. Despite the warnings, the six men and their parrot, Lorita, put to sea on April 28, 1947. Drifting with the trade winds, riding heavy swells, the unwieldy Kon-Tiki proved astonishingly seaworthy. Rather than chafe the Manila rope lashings, the balsa logs became soft and spongy, leaving the rope unharmed and effectively protecting it. Water swept over the raft and through the logs as if passing through the prongs of a fork. The floating prefab progressed through the southern latitudes at an average rate of 37 nautical miles a day. According to Heyerdahl’s account, when the seas were really rough and the waves really high—say, 25 feet—the helmsmen, sometimes waist deep in water, “left the steering to the ropes and jumped up and hung on to a bamboo pole from the cabin roof, while the masses of water thundered in over them from astern. Then they had to fling themselves at the oar again before the raft could turn around, for if the raft took the seas at an angle the waves could easily pour right into the bamboo cabin.” Among the post-Incan furnishings provided by the U.S. military were tinned food, shark repellent and six-watt transmitters. “Heyerdahl knew the value of good marketing,” offers Reidar Solsvik, curator of the Kon-Tiki Museum in Oslo. “He only allowed one navigator in his crew, but he made sure his raft had five radio sets.” Heyerdahl’s radioman broadcast daily progress reports to ham operators, who relayed the messages to a press as ravenous as bird-eating sharks and a postwar public eager to embrace overnight heroes. “The general public was enthralled,” says Jeremy Thomas. “Much of western civilization lay in ruins, and the Kon-Tiki took all the hardship off the front pages.” Newspapers around the world charted the path of the daredevil explorers as if they were orbiting the moon. “Heyerdahl was a great storyteller, but his true genius was in PR,” says Joachim Roenning, who directed the new film with his childhood friend Espen Sandberg. “The voyage of the Kon-Tiki was the world’s first reality show.” Aboard the raft, the 20th-century Argonauts supplemented their G.I. rations with coconuts, sweet potatoes, pineapples (they had stashed away 657 cans), water stored in bamboo tubes and the fish they caught. During long lulls, they entertained themselves by baiting the ever-present sharks, snatching them by the tails and hoisting them aboard. Dozens of them. In the documentary assembled from footage Heyerdahl shot with his trusty 16-mm camera, a crew member dangles a mahi-mahi over the side of the raft and a shark pops up, snaps its jaws and takes half of the fish with it. “Just a childish game to relieve boredom,” says Heyerdahl’s eldest son, Thor Jr., a retired marine biologist. 
“For Norwegians, the concept of ‘conversation’ probably didn’t exist in those days.” It would be three months before land was sighted. The Kon-Tiki passed several of the outlying islands of the Tuamotu Archipelago, and after 101 days at sea, was pushed by tail winds to a jagged coral reef. Rather than risk running the raft aground, Heyerdahl ordered the sail lowered and centerboards up. Anchors were rigged from the mast. A swell lifted the Kon-Tiki high and flung it in the shallows beyond the roaring breakers. The cabin and mast collapsed, but the men hung onto the main logs and emerged mostly unharmed. They straggled ashore on Raroia, an uninhabited atoll in French Polynesia. The flimsy Kon-Tiki had traveled more than 3,700 nautical miles. Heyerdahl’s book would inspire a pop phenomenon. Kon-Tiki begat Tiki bars, Tiki motels, Tiki buses, Tiki sardines, Tiki shorts, Tiki cognac, Tiki chardonnay, vanilla-cream Tiki wafers and a tune by the Shadows that topped the British singles charts. This year marks the 50th anniversary of the Enchanted Tiki Room, a Disneyland attraction that features Tiki drummers, Tiki totem poles and a flock of tropical Audio-Animatronic birds singing “The Tiki Tiki Tiki Room.” Looming in the dim light, a colossal whale shark gambols in the briny deep. The 30-foot creature, a plastic model of one that darted playfully beneath the Kon-Tiki and threatened to upend it, is suspended from the basement ceiling of the museum. Many a kid who grew up in or visited Oslo has stood in the semidarkness and marveled at the monster and imagined its fearful snort. In the museum’s diorama, the ocean stretches on forever. Joachim Roenning and Espen Sandberg first glimpsed the whale shark when they were 10 years old. But what really caught their eye was the shiny gold idol that reposed in a glass case one floor above: Heyerdahl’s Oscar. “For us,” says Sandberg, “that was even bigger than the whale shark.” Growing up in Sandefjord, a small town south of Oslo, Sandberg and Roenning didn’t read and reread Kon-Tiki to learn about migration theory. “We wanted to be part of Heyerdahl’s adventure,” says Roenning. “As a Norwegian, he fascinated us. He was ambitious and unafraid to admit it, which is not very Norwegian.” Heyerdahl never veered from the course he set. In the wake of the Kon-Tiki, he pursued and promoted his controversial theories. He led cruises aboard the reed rafts Ra, Ra II and Tigris. He conducted fieldwork in Bolivia, Ecuador, Colombia and Canada. In Peru, he unearthed raft centerboards that he believed suggested return voyages from Polynesia against the wind might have been possible. For a half-century, Heyerdahl refused to go to Hollywood. Many deep-pocketed producers came calling about Kon-Tiki. “All were kicked out to sea,” says Sandberg. “I think Thor was afraid of becoming the Kon-Tiki Man. He wanted to be judged on his body of work.” Then one day in 1996 Jeremy Thomas showed up on the doorstep of Heyerdahl’s home in the Canary Islands. The British impresario had an Oscar under his belt—for Bernardo Bertolucci’s The Last Emperor (1987)—and a story pitch on his lips. “In my imagination,” he says, “Kon-Tiki was about six hippies on a raft.” When Heyerdahl, then 81, resisted, the 47-year-old Thomas persisted. He enlisted the aid of Heyerdahl’s third wife, Jacqueline, a former Miss France who’d appeared in a tranche of American movies (Pillow Talk, The Prize) and TV shows (“Mister Ed,” “The Man From U.N.C.L.E.”). 
On Thomas’ third trip to the Canaries, Heyerdahl caved and signed over the rights. It wasn’t necessarily that Thomas’ countercultural vision had won him over. “Thor was short on expedition funding for one of his wilder theories,” says Reidar Solsvik. Heyerdahl believed that the Viking god Odin may have been a real king in the first century B.C. He used at least some of the money to search in southern Russia for evidence of Odin, who ruled over Asgard. Thomas sought funding, too. He hoped to mount Kon-Tiki as an English-language blockbuster with a $50 million budget. He sent a series of big-name screenwriters to confer with Heyerdahl, whose own script was rejected out of hand. Reportedly, Melissa Mathison of E.T. the Extra-Terrestrial fame wrote a draft. Jacqueline remembers accompanying her husband to a screening of Raiders of the Lost Ark, which starred Mathison’s then husband, Harrison Ford. “Thor was not impressed by Indiana Jones,” Jacqueline says. “They had different approaches to archaeology.” Who would play Heyerdahl? Lots of names were tossed around: Ralph Fiennes, Kevin Costner, Brad Pitt, Jude Law, Christian Bale, Leonardo DiCaprio and, Jacqueline’s personal favorite, Ewan McGregor. Basically, any big-name actor who could pass as a blond. But even with Phillip Noyce (Patriot Games) aboard to direct, financing proved difficult. “Potential backers thought moviegoers wouldn’t be interested in the voyage because no one had died,” Thomas says. “You can’t make an adventure film about fishing and sunbathing.” The poor parrot Lorita would have to be sacrificed for art. Before Heyerdahl’s death in 2002, Thomas reduced the movie’s scale and brought in Norwegian writer Petter Skavlan to reshape Kon-Tiki as a contemporary Norse tale. Noyce bowed out and was replaced by Roenning and Sandberg, whose 2008 World War II thriller Max Manus is Norway’s highest-grossing film ever. Instead of filming on the high seas of Australia and Fiji, as Thomas had planned, the shooting location was moved to the Mediterranean island of Malta, where the costs were lower and the sea was flat. The budget shrank to $15 million, petty cash by Hollywood standards. The Scandinavian cast did multiple takes in Norwegian and English. “I wanted more than 12 people to see the film,” Thomas said. In Norway, they already have: Kon-Tiki has grossed some $14 million at the box office. When discussing the movie, Thomas tends to sound like a marketing guru who’s brought a dormant product back to life. “Celebrities like Marilyn Monroe and James Dean are still hot largely because they died young,” he says. “Heyerdahl got cold because he died very old. The new film will help invigorate his brand.” Initially, the repackaging troubled Thor Jr. He objects to the depiction of crewmate Herman Watzinger. In real life, Watzinger was a plucky refrigeration engineer who resembled Gregory Peck. In the film, he’s a gutless, beer-gutted refrigerator salesman known to sharks as Lunch. “I regret that the filmmakers used Herman’s name,” says Thor Jr. “I understand why they needed a character who represented human weakness, but they should have called him Adam or Peter.” Watzinger’s 70-year-old daughter, Trine, was not amused. Before the picture premiered last summer in Oslo, she complained to the Norwegian press. Accused of “character assassination,” the filmmakers tried to mollify Trine with the idea that Watzinger redeems himself at the end of the movie—his nifty scheme involving wave patterns propels the Kon-Tiki through the rollers. 
Still, she refused to attend the premiere. “A disclaimer has been inserted at the end of the DVD,” Thor Jr. says. “Of course, you have to sit through the closing credits to see it.” His other concern was the aggressively romantic ending. On the beach in Raroia, a crewmate hands Thor Sr. a Dear Johan letter from Liv. In a voice-over, she selflessly explains why she’s dumping him: Unencumbered by family, he’ll be free to chase impossible dreams. The camera cuts from Liv—turning away from the sun and walking toward their house in the mountains of Norway—to Thor, squinting into the sun and toward the glowing sail of the Kon-Tiki. *** As it turns out, reality was a bit more complex. “There was no letter,” reports Thor Jr. His mom, he says, never quite forgave his dad for squelching her possible dreams on their honeymoon in the Marquesas. Liv wanted to be seen as half of a research team, but Thor insisted on taking all the credit. “My father couldn’t cope with her being such a strong, independent woman,” says 74-year-old Thor Jr., who was estranged from his old man for much of his youth. “His idea of the perfect female was a Japanese geisha, and my mother was no geisha.” A month after the Kon-Tiki made landfall, the Heyerdahls arranged to reunite at an airport in New York. He would fly from Tahiti; she, from Oslo. He was waiting on the tarmac when her plane landed. “She was eager to embrace him,” Thor Jr. says. But she could barely pierce the phalanx of photographers that encircled him. Liv was furious. “She had been set up,” Thor Jr. says. “An intimate private meeting had become a public performance. She gave my father a very cold hug.” Thor Sr. felt humiliated. He and Liv divorced a year later. Heyerdahl’s migration ideas haven’t fared much better than his first marriage. Though he enlarged our notions of the early mobility of humans, his Kon-Tiki theory has been widely discredited on linguistic and cultural grounds. He was partly vindicated in 2011 when Norwegian geneticist Erik Thorsby tested the genetic makeup of Easter Islanders whose ancestors had not interbred with Europeans and other outsiders. Thorsby determined that their genes include DNA that could only have come from Native Americans. On the other hand, he was emphatic that the island’s first settlers came from Asia. “Heyerdahl was wrong,” he said, “but not completely.” A longtime senior writer at Sports Illustrated and the author of several memoirs, Franz Lidz has written for the New York Times since 1983, on travel, TV, film and theater. He is a frequent contributor to Smithsonian.
265de64bb47167e381bcdca3ae99df13
https://www.smithsonianmag.com/history/learn-origins-term-affirmative-action-180959531/
The Origins of the Term “Affirmative Action”
The Origins of the Term “Affirmative Action” For a term as loaded with political meaning as “affirmative action,” it might come as a surprise to learn that its origins on the political landscape remain something of a mystery. Merriam-Webster places its first known use in 1965, but the historical record shows it being used years before. This week, the term is in the news because the Supreme Court may reverse course on an almost 40-year-old ruling that declared race-based affirmative action constitutional in Regents of the University of California v. Bakke (1978). Court watchers are predicting that the suit challenging the use of racial preference as a factor in the college admissions process, Fisher v. University of Texas II, will end in a 4-3 decision against affirmative action (Elena Kagan has recused herself from the case after working on it as U.S. solicitor general). Justice Anthony Kennedy, the expected swing vote, “does not like affirmative action and has never voted to affirm it,” as Garrett Epps put it for The Atlantic in December 2015, when the court heard oral arguments in the case, which is actually a re-hearing of a case originally brought in 2008. (Hence the Roman numeral.) The Court last upheld affirmative action in admission decisions in 2003 in Grutter v. Bollinger. The case in question today began when Abigail Fisher, a white high school student, sued the University of Texas at Austin after being denied admission to the school, arguing that the school's affirmative action policy violates her 14th Amendment rights under the equal protection clause. In Texas, students who rank in the top 10 percent of their public high schools are guaranteed a spot at UT-Austin. Fisher, who came in the top 12 percent of her class, missed the mark. The rest of the student population goes through a regular admissions process that considers race and ethnicity as factors. Depending on how broadly the court rules, Fisher II could reverse Bakke in what would be a “disastrous blow to proponents of race-based affirmative action,” Elton Lossner writes for the Harvard Political Review. Though education is largely the focus of today’s affirmative action debate, the origin of the term is rooted in the legalese of employment law, explains Shirley J. Wilcher, the executive director for the American Association for Access, Equity and Diversity. To take an "affirmative action" was to literally act affirmatively—not allowing events to run their course but rather having the government (or employers) take an active role in treating employees fairly. Most prominent among the early sightings of the phrase "affirmative action" is its presence in the National Labor Relations Act of 1935. Better known as the Wagner Act, the legislation established the National Labor Relations Board and collective bargaining, as well as decreeing that employers found to have engaged in discriminatory labor practices would be required “...to take such affirmative action including reinstatement of employees with or without backpay...”. The phrase's association with race had not yet been established. Employers reacted with hostility to the new law and called the NLRB biased toward laborers. “Employers almost universally did not welcome the Act,” said NLRB chairman J. Warren Madden at the time. The Supreme Court ruled that the Wagner Act was constitutional in 1937. Four years later, on the cusp of U.S. involvement in World War II, civil rights activist A. 
Philip Randolph led the nationwide effort protesting the fact that African-Americans were contributing to the war effort while still being subject to Jim Crow segregation laws at home. This March on Washington Movement planned a demonstration on the U.S. Capitol grounds for July 1, 1941. As many as 100,000 people were expected to show up, writes BlackPast.org. On June 25, 1941, days before the planned march, President Franklin Roosevelt issued Executive Order 8802, which created the first Fair Employment Practices Committee (FEPC) and forced defense contractors “....to provide for the full and equitable participation of all workers in defense industries, without discrimination... .” While EO 8802 didn’t use the term “affirmative action,” it was the first presidential order to lay the groundwork for later implementations of this public policy. Victory in hand, the movement cancelled its march. But by 1945, despite progress, industrial intolerance remained deep-rooted. Chester Bowles, the committee chairman of the FEPC, wrote a letter to The New York Times, criticizing the executive order as just a plug to fix the leak: American minority groups have made gains in the war industry and in government service during the last four years. Old prejudices have been gradually broken down and old customs swept aside, but the roots of the problem of industrial intolerance go deep and we have still a long way to go. President Dwight D. Eisenhower would build on FDR's work with the 1953 Executive Order 10479, which created the anti-discrimination Committee on Government Contracts. But President John F. Kennedy would become the first president to marry the term “affirmative action” with its modern-day connotation of a policy seeking to ensure racial equality.  On May 6, 1961, in Executive Order 10925, he called on government contractors to "...take affirmative action to ensure that applicants are employed and that employees are treated during employment without regard to their race, creed, color, or national origin." However, the order did not specify what such actions would entail. It would be Kennedy’s Committee on Equal Employment Opportunity, which instituted the Plans for Progress (PfP) program that paved the way for affirmative action, says Wilcher. The PfP was made up of a voluntary association of more than 400 of the nation's largest industrial employers who adopted equal opportunity programs, as Anthony S. Chen writes in his book, The Fifth Freedom: Jobs, Politics, and Civil Rights in the United States 1941-1972. During President Lyndon Johnson’s administration, the phrase “affirmative action” found its legs. As Google’s Ngram viewer illustrates, the words would spike in the American lexicon after Johnson issued Executive Order 11246 on September 24, 1965. The order demanded that contractors "take affirmative action to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, color, religion, sex or national origin." And, in order to ensure this, in 1966, Johnson established the Office of Federal Contract Compliance Programs in the U.S. Department of Labor. Johnson’s work on affirmative action would be furthered by President Richard Nixon, whose Executive Order 11478, issued on August 8, 1969, called for unilateral affirmative action in all government employment. Meanwhile, the next chapter of affirmative action would expand toward education, starting with the Supreme Court’s Green v. 
County School Board of New Kent County ruling in 1968, which mandated that all school boards provide a plan to end segregated systems in their districts, in order to be in compliance with Brown v. Board of Education (1954). The order would become a rallying point for conservatives and liberals alike. As the Virginia Historical Society writes: Because of white flight to private academies and to the suburbs, racial balance could not be achieved in many city schools without extensive busing of students citywide or across city-county boundaries. This set the stage for a sharp white backlash against social engineering by the judiciary and a strengthening of conservative political opinion. This pushback would come to the attention of the Supreme Court in 1978 with Bakke. The lawsuit was filed by Allan Bakke, a white applicant to University of California, Davis’ medical school, who had twice been denied admission to the school despite having MCAT scores and a GPA higher than candidates who had been admitted to the program. The medical school at that time reserved 16 out of 100 spots for minorities. In a 5-4 decision, the Supreme Court ruled that while quotas violated the 14th Amendment’s Equal Protection Clause, race could be used as a factor in applications to promote diversity in education. Grutter v. Bollinger, which came to the Supreme Court’s docket in 2003, relied on Bakke. The case centered on Barbara Grutter, a white applicant to the University of Michigan’s law school. The school’s admission process did not have quotas, but did look favorably upon minority applicants.  In another 5-4 decision, the court ruled that the university’s case-by-case consideration of applicants, which included race as one narrow factor in its decision-making, made its admissions process legal. As the court readies to rule on Fisher II, it’s uncertain where affirmative action will stand in higher education after this week. Perspectives range across the ideological spectrum on its purpose and effectiveness. The conservative viewpoint was best epitomized by the now-famous phrase Chief Justice John Roberts wrote in the plurality opinion striking down a Seattle plan to integrate students by assigning them to schools, Parents Involved in Community Schools v. Seattle School District, in 2007: “The way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” Wilcher, for her part, sees affirmative action as a pillar of civil rights legislation.  “Affirmative action has taken on negative connotations through the media and those that would like to do away with it or oppose the concept, but the impetus is on action, not nondiscrimination,” says Wilcher. “You have got to show that you tried, and that’s what affirmative action under the Johnson order means; that’s what it meant in 1965, and that’s what it means today.” However the court rules, the term’s roots in presidential executive orders endure. Today, protected classes for federal contractors under Johnson’s Executive Order 11246 include race, color, religion, sex and national origin, as well as sexual orientation and gender identity, after President Barack Obama signed an Executive Order adding those classes to the list in 2014. Jacqueline Mansky is a freelance writer and editor living in Los Angeles. She was previously the assistant web editor, humanities, for Smithsonian magazine.
d5b24cd0b72f0c1fd49aaf191a655cc7
https://www.smithsonianmag.com/history/legacy-black-hawk-down-180971000/
Twenty-five years ago, I was drawn to Somalia in the aftermath of Operation Restore Hope, a U.S. initiative supporting a United Nations resolution that aimed to halt widespread starvation. The effort, started in 1992, secured trade routes so food could get to Somalis. The U.N. estimated that no fewer than 250,000 lives were saved. But Operation Restore Hope would be best remembered in the United States for a spectacular debacle that has shaped foreign policy ever since. Almost right away, militias led by the Somali warlord Mohamed Farrah Aidid began attacking and killing U.N. peacekeepers. On October 3 and 4, 1993, U.S. forces set out on a snatch-and-grab mission to arrest two of Aidid’s lieutenants. The plan was to surround a white three-story house in the capital city of Mogadishu where leaders of Aidid’s Habar Gidir clan were gathering. Rangers would helicopter in, lower themselves on ropes and surround the building on all sides. A ground convoy of trucks and Humvees would wait outside the gate to carry away the troops and their prisoners. Altogether, the operation would involve 19 aircraft, 12 vehicles and around 160 troops. The operation didn’t go as planned. The ground convoy ran up against barricades formed by local militias. One helicopter landed a block north of its target and couldn’t move closer because of groundfire. A ranger fell from his rope and had to be evacuated. Insurgents shot down two American Black Hawk helicopters with rocket-propelled grenades. When about 90 U.S. Rangers and Delta Force operators rushed to the rescue, they were caught in an intense exchange of gunfire and trapped overnight. Altogether, the 18-hour urban firefight, later known as the Battle of Mogadishu, left 18 Americans and hundreds of Somalis dead. News outlets broadcast searing images of jubilant mobs dragging the bodies of dead Army special operators and helicopter crewmen through the streets of Mogadishu. The newly elected U.S. president, Bill Clinton, halted the mission and ordered the Special Forces out by March 31, 1994. For Somalis, the consequences were severe. Civil war raged—Aidid himself was killed in the fighting in 1996—and the country remained lawless for decades. Pirate gangs along the country’s long Indian Ocean coastline menaced vital shipping lanes. Wealthy and educated Somalis fled. When I visited Somalia for the first time, in 1997, the country was well off the map of world interest. There were no commercial flights to the capital city, but each morning small planes took off from Wilson Airport in Nairobi, Kenya, for rural landing strips throughout the country. My plane was met by a small platoon of hired gunmen. On our way into the city, smaller bands of brigands grudgingly removed barriers that had been stretched across the dirt road to halt traffic. The driver of my vehicle tossed fistfuls of near-worthless paper Somali shillings as we passed these local versions of tollbooths. The city itself was in ruins. The few large buildings were battle-scarred and filled with squatters, whose fires glowed through windows empty of glass and stripped of aluminum frames. Gas generators banged away to provide power to those few places where people could afford it. Militias fought along the borders of city sectors, filling the hospitals with bloody fighters, most of them teenagers. The streets were mostly empty, except for caravans of gunmen. Without government, laws, schools, trash pickups or any feature of civil society, extended clans offered the only semblance of safety or order. 
Most were at war with each other for scarce resources. I described this wasteland in my 1999 book about the Battle of Mogadishu and its aftermath, Black Hawk Down (the basis of the 2001 movie directed by Ridley Scott). When I returned to the States and spoke to college audiences about the state of things in Somalia, I would ask if there were any anarchists in the crowd. Usually a hand or two went up. “Good news,” I told them, “you don’t have to wait.” The consequences were felt in America, too. After Mogadishu, the United States became wary of deploying ground forces anywhere. So there was no help from America in 1994 when Rwandan Hutus slaughtered as many as a million of their Tutsi countrymen. Despite a global outcry, U.S. forces stayed home in 1995 as Bosnian Serbs mounted a genocidal campaign against Muslim and Croatian civilians. That isolationism ended abruptly on September 11, 2001. But even as Presidents George W. Bush and Barack Obama sent troops to Iraq and Afghanistan, they kept their distance from the Islamic insurgents in Somalia. During the last two years of the Obama administration, there were only 18 airstrikes (both drones and manned) on Somalia. Now things are changing. In the past two years, U.S. forces have conducted 63 airstrikes on targets in Somalia. The number of American forces on the ground has doubled, to about 500. And there have already been fatalities: a Navy SEAL, Senior Chief Special Warfare Operator Kyle Milliken, was killed in May of 2017 assisting Somali National Army troops in a raid about 40 miles west of Mogadishu, and Army Staff Sgt. Alexander Conrad was killed and four others wounded in June of this year during a joint mission in Jubaland. All of this might raise the question: What do we expect to achieve by returning to Somalia? After years of turmoil in Afghanistan and Iraq, why should we expect this mission to be any different? * * * A casual visitor to Mogadishu today might not see an urgent need for U.S. ground troops. There are tall new buildings, and most of the old shanties have been replaced by houses. There are police, sanitation crews and new construction everywhere. Peaceful streets and thriving markets have begun to restore the city to its former glory as a seaside resort and port. Somali expatriates have begun reinvesting, and some are returning. The airport is up and running, with regular Turkish Airlines flights. Brig. Gen. Miguel Castellanos first entered Mogadishu as a young Army officer with the Tenth Mountain Division in 1992, looking down from the open door of a Black Hawk helicopter. He is now the senior U.S. military officer in Somalia. “I was pretty surprised when I landed a year ago and there was actually a skyline,” he told me. Somalia largely has its neighbors to thank for this prosperity. In 2007, African Union soldiers—mostly from Uganda but also from Kenya, Ethiopia, Burundi, Djibouti and Sierra Leone—began pushing the extremist group the Shabab out of the country’s urban centers with an effort dubbed the African Union Mission to Somalia (AMISOM). The United States lent support in the form of training and equipment. Turkey and the United Arab Emirates have taken advantage of the newfound peace and bankrolled development of Somalia’s port cities. The problem is in the rural areas. There, basic security depends almost entirely on local militias whose loyalties are tied to clans and warlords. “There is a real black-and-white, good and evil struggle in Somalia,” said Stephen Schwartz, who served as U.S. 
ambassador there until the end of September 2017. “The forces of chaos, of Islamist extremism, are powerful and have decades of inertia behind them in criminality, the warlords and cartels.” If current conditions persist, the Shabab, Al Qaeda’s affiliate in East Africa, could end up controlling large parts of the country, says Abdullahi Halakhe, a security consultant for the Horn of Africa who previously worked for the U.N. and the BBC. “They would be running their own schools, their own clinics, collecting trash. That is where the appeal of this group comes.” So far, the United States has been dealing with this threat with a string of targeted killings. Top Shabab leaders were killed by U.S. raids and airstrikes in 2017 and 2018. But the experts I spoke with told me these hits may not ultimately accomplish much. “Killing leaders is fine, makes everybody feel good; they wake up in the morning, big headline they can quantify—‘Oh we killed this guy, we killed that guy’—but it has absolutely no long-term effect and it really doesn’t have any short-term effect either,” said Brig. Gen. Don Bolduc, who until last year commanded Special Operations in Africa and directly oversaw such efforts. “Someone is always going to be there to be the next leader.” Every expert I spoke with recommended investing in rebuilding the country instead. This approach didn’t work well in Afghanistan, but there are differences. Somalia’s president, Mohamed Abdullahi Mohamed, is friendly to the United States—and he was chosen by his own people, not installed by the U.S. Somalia’s Islamist extremists no longer enjoy broad ideological support. “There was a time when the Shabab could transcend all the regional clan differences and project this kind of Pan Somalia, Pan Islam type of image,” said Halakhe. “That is gone.” The country’s problems are mostly economic, says Bolduc, and solving them would cost so much less than the trillions spent in Afghanistan and Iraq that the question doesn’t fall into the same category. He points to success in Puntland, Somalia’s northernmost member state. In 2017, Bolduc and his special forces worked with the state’s president, Abdiweli Mohamed Ali Gaas, and with American diplomats to assemble local forces and tribal elders. They trained the Puntland militias but offered no air or ground support. Working entirely on their own, Somali forces moved from southern Puntland up to a northern port where the Islamic State (a rival of the Shabab) had established control. They took back everything and secured it in about a week. “ISIS East Africa has not been able to get a foothold back into these areas,” says Bolduc. “And those villages are holding today.” Schwartz says this success could be replicated throughout Somalia if the United States invested a fraction of what it has been spending on special operators and drones. “The budget of the Somali government is comparable to the salary cap for the Washington Nationals baseball team,” he said. “They’re both around $210 million.” He said that less than half that amount would be enough to enable the president to pay the salaries of Somalia National Army recruits and other government employees. That step alone, he says, “would make our investment on the military side more successful.” It would be foolish to try such an intervention in other countries where America is in conflicts. It wouldn’t work, for instance, in Pakistan, where there’s a powerful Islamist presence, a sophisticated military and a history of tensions with the United States. 
Our experiences in Afghanistan and Iraq—and, years ago, in Vietnam—showed us that American efforts will continually fail if there isn’t a willing local government with the support of the people. But just because those approaches failed in the past doesn’t mean they have to fail in Somalia. Radical Islam takes different forms, and there can be no one-size-fits-all approach to fighting it. In countries where leaders are friendly and ideologies don’t run deep, there may still be an opportunity to build enduring stability. These days, that might be as good a definition of “victory” as we can get. This article is a selection from the January/February issue of Smithsonian magazine
64d169a84d3ecd97aecbe2b0e2c4290f
https://www.smithsonianmag.com/history/legends-what-actually-lived-no-mans-land-between-world-war-i-trenches-180952513/
World War I: 100 Years Later
World War I: 100 Years Later During World War I, No Man’s Land was both an actual and a metaphorical space. It separated the front lines of the opposing armies and was perhaps the only location where enemy troops could meet without hostility. It was in No Man's Land that the spontaneous Christmas truce of December 1914 took place and where opposing troops might unofficially agree to safely remove their wounded comrades, or even sunbathe on the first days of spring. But it could also be the most terrifying of places; one that held the greatest danger for combatants. “Men drowning in shell-holes already filled with decaying flesh, wounded men, beyond help from behind the wire, dying over a number of days, their cries audible, and often unbearable to those in the trenches; sappers buried alive beneath its surface," wrote scholar Fran Brearton in her 2000 history The Great War in Irish Poetry: W.B. Yeats to Michael Longley. No Man’s Land, said poet Wilfred Owen, was “like the face of the moon, chaotic, crater-ridden, uninhabitable, awful, the abode of madness.” In the Oxford English Dictionary, Nomanneslond, ca. 1350, comes from the Middle English, and was “a piece of ground outside the north wall of London, formerly used as a place of execution.” The phrase took on a military connotation as early as 1864, but it became an especially prevalent term during the First World War. The German equivalent was Niemandsland, while the French used the English term le no man’s land. But it was during the Great War that a legend arose out of the real-life horrors that occurred in this wartime hellhole. Part Night of the Living Dead and part War Horse, like all oft-told tales, it had several variants, but the basic kernel warned of scar-faced and fearless deserters banding together from nearly all sides—Australian, Austrian, British, Canadian, French, German, and Italian (though none from the United States)—and living deep beneath the abandoned trenches and dugouts. According to some versions, the deserters scavenged corpses for clothing, food and weapons. And in at least one version, the deserters emerged nightly as ghoulish beasts, to feast upon the dead and dying, waging epic battles over the choicest portions. Historian Paul Fussell called the tale the “finest legend of the war, the most brilliant in literary invention and execution as well as the richest in symbolic suggestion” in his prize-winning 1975 book. Fussell, a professor of English at the University of Pennsylvania who had served as a lieutenant during World War II, knew well the horrors of combat, which he vividly described in his 1989 Wartime. One of the earliest published versions of the “wild deserters” legend appeared in the 1920 memoir The Squadroon by Ardern Arthur Hulme Beaman, a lieutenant colonel in the British cavalry. No other telling of the legend—at least in print—is as horrifying as Beaman’s. Written just two years after the war’s end, Beaman's tale begins in early 1918 on the marshes of the Somme in northern France. This is where some of the bloodiest battles of the war were fought and Beaman is convinced that he has witnessed two dozen or so German prisoners of war vanish into the ground. He wants to send a search party into the maze of abandoned trenches but is advised against it because the area “was peopled with wild men, British, French, Australian, German deserters, who lived there underground, like ghouls among the mouldering dead, and who came out at nights to plunder and to kill. 
In the night, an officer told him, mingled with the snarling of carrion dogs, they often heard inhuman cries and rifle shots coming from that awful wilderness as though the bestial denizens were fighting among themselves.” In the 1930 novel Behind the Lines (or The Strange Case of Gunner Rawley, its title in the U.S.) by Walter Frederick Morris, who had served in the war as a battalion commander, the protagonist Peter Rawley, a second lieutenant, deserts his Royal Field Artillery unit after killing his company commander. Somewhere on the battlefields of France, Rawley meets up with Alf, another deserter, who leads him underground. “Rawley squeezed through the hole, feet first. He found himself in a low and narrow tunnel, revetted with rotting timbers and half-blocked with falls of earth. . . . The whole place was indescribably dirty and had a musty, earthy, garlicky smell, like the lair of a wild beast. . . . ‘Where do you draw your rations?’ asked Rawley. . . . ‘Scrounge it, [Alf] answered, . . . We live like perishin’ fightin’ cocks sometimes, I give you my word. . . . There’s several of us livin’ round ’ere in these old trenches, mostly working in pairs.” Another gruesome description of wartime outlaws and deserters came in the 1948 five-volume autobiography Laughter in the Next Room by Sir Osbert Sitwell, a fifth baronet and a captain in the Army (he was also the younger brother of the poet Dame Edith Sitwell). In recalling Armistice Day 1918, Sitwell wrote, “For four long years . . . the sole internationalism—if it existed—had been that of deserters from all the warring nations, French, Italian, German, Austrian, Australian, English, Canadian. Outlawed, these men lived—at least, they lived—in caves and grottoes under certain parts of the front line. Cowardly but desperate as the lazzaroni of the old Kingdom of Naples, or the bands of beggars and coney catchers of Tudor times, recognizing no right, and no rules save of their own making, they would issue forth, it was said, from their secret lairs, after each of the interminable checkmate battles, to rob the dying of their few possessions—treasures such as boots or iron rations—and leave them dead.” Sitwell’s concluding note is equally chilling: British troops believed “that the General Staff could find no way of dealing with these bandits until the war was over, and that in the end they [the deserters] had to be gassed.” A more recent literary account comes in 1985 from No Man’s Land by Reginald Hill, author of some 50 novels, many of them police procedurals. The novel begins with Josh Routledge, a British deserter from the Battle of the Somme, and a German soldier-turned-pacifist, Lothar von Seeberg, being chased by mounted military police. Out of almost nowhere, a band of 40 deserters, mostly Australian, attack the military police, and take Josh and Lothar into their dugout. “They were a wild-looking gang, in dirty ragged clothing and with unkempt hair and unshaven faces. They were also very well armed.” In a second instance, these deserters come “swarming out of nowhere, out of the bowels of the earth, that’s how it looked. . . . They was scruffy, dead scruffy. Sort of rugged and wild-looking, more like a bunch of pirates than anything. There was one big brute, nigh on seven foot tall he looked.” The legend seems to have also taken root in modern journalistic accounts. 
James Carroll in the International Herald Tribune noted in 2006 how World War I deserters refusing to fight “had organized themselves into a kind of third force—not fighters any more, but mere survivors, at home in the caverns. Dozens of them, perhaps hundreds. Human beings caring for one another, no matter what uniform they were wearing.” According to Carroll’s interpretation, these deserters were like angels, taking care of those who had fallen into the safety of the underground caverns—acting as a sane alternative to the insanity of war. The wild deserters of no man’s land, whether angels or devils—or even flesh-eating ghouls who emerge only at night—are the stuff of a legend extremely rich in symbolic value. It reminds us today, a century after the war began, of the madness, chaos and senselessness of all the horrors of war. James Deutsch is a curator at the Smithsonian Center for Folklife and Cultural Heritage, where he has helped develop exhibitions on the Peace Corps, China and World War II, among others. In addition, he serves as an adjunct professor—teaching courses on American film history and folklore—in the American Studies Department at George Washington University.
f1715d3684060444624a5d93c1d43024
https://www.smithsonianmag.com/history/leopold-and-loebs-criminal-minds-996498/
Leopold and Loeb’s Criminal Minds
Leopold and Loeb’s Criminal Minds Nathan Leopold was in a bad mood. That evening, on November 10, 1923, he had agreed to drive with his friend and lover, Richard Loeb, from Chicago to the University of Michigan—a journey of six hours—to burglarize Loeb's former fraternity, Zeta Beta Tau. But they had managed to steal only $80 in loose change, a few watches, some penknives and a typewriter. It had been a big effort for very little reward and now, on the journey back to Chicago, Leopold was querulous and argumentative. He complained bitterly that their relationship was too one-sided: he always joined Loeb in his escapades, yet Loeb held him at arm's length. Eventually Loeb managed to quiet Leopold's complaints with reassurances of his affection and loyalty. And as they continued to drive along the country roads in the direction of Chicago, Loeb started to talk about his idea to carry out the perfect crime. They had committed several burglaries together, and they had set fires on a couple of occasions, but none of their misdeeds had been reported in the newspapers. Loeb wanted to commit a crime that would set all of Chicago talking. What could be more sensational than the kidnapping and murder of a child? If they demanded a ransom from the parents, so much the better. It would be a difficult and complex task to obtain the ransom without being caught. To kidnap a child would be an act of daring—and no one, Loeb proclaimed, would ever know who had accomplished it. Leopold and Loeb had met in the summer of 1920. Both boys had grown up in Kenwood, an exclusive Jewish neighborhood on the South Side of Chicago. Leopold was a brilliant student who matriculated at the University of Chicago at the age of 15. He also earned distinction as an amateur ornithologist, publishing two papers in The Auk, the leading ornithological journal in the United States. His family was wealthy and well connected. His father was an astute businessman who had inherited a shipping company and had made a second fortune in aluminum can and paper box manufacturing. In 1924, Leopold, 19, was studying law at the University of Chicago; everyone expected that his career would be one of distinction and honor. Richard Loeb, 18, also came from a wealthy family. His father, the vice president of Sears, Roebuck & Company, possessed an estimated fortune of $10 million. The third son in a family of four boys, Loeb had distinguished himself early, graduating from University High School at the age of 14 and matriculating later the same year at the University of Chicago. His experience as a student at the university, however, was not a happy one. Loeb's classmates were several years older and he earned only mediocre grades. At the end of his sophomore year, he transferred to the University of Michigan, where he remained a lackluster student who spent more time playing cards and reading dime novels than sitting in the classroom. And he became an alcoholic during his years at Ann Arbor. Nevertheless he managed to graduate from Michigan, and in 1924 he was back in Chicago, taking graduate courses in history at the university. The two teenagers had renewed their friendship upon Loeb's return to Chicago in the fall of 1923. They seemed to have little in common—Loeb was gregarious and extroverted; Leopold misanthropic and aloof—yet they soon became intimate companions. And the more Leopold learned about Loeb, the stronger his attraction for the other boy. 
Loeb was impossibly good-looking: slender but well built, tall, with brown-blond hair, humorous eyes and a sudden attractive smile; and he had an easy, open charm. That Loeb would often indulge in purposeless, destructive behavior—stealing cars, setting fires and smashing storefront windows—did nothing to diminish Leopold's desire for Loeb's companionship. Loeb loved to play a dangerous game, and he sought always to raise the stakes. His vandalism was a source of intense exhilaration. It pleased him also that he could rely on Leopold to accompany him on his escapades; a companion whose admiration reinforced Loeb's self-image as a master criminal. True, Leopold was annoyingly egotistical. He had an irritating habit of bragging about his supposed accomplishments, and it quickly became tiresome to listen to Leopold's empty, untrue boast that he could speak 15 languages. Leopold also had a tedious obsession with the philosophy of Friedrich Nietzsche. He would talk endlessly about the mythical superman who, because he was a superman, stood outside the law, beyond any moral code that might constrain the actions of ordinary men. Even murder, Leopold claimed, was an acceptable act for a superman to commit if the deed gave him pleasure. Morality did not apply in such a case. Leopold had no objection to Loeb's plan to kidnap a child. They spent long hours together that winter, discussing the crime and planning its details. They decided upon a $10,000 ransom, but how would they obtain it? After much debate they came up with a plan they thought foolproof: they would direct the victim's father to throw a packet containing the money from the train that traveled south of Chicago along the elevated tracks west of Lake Michigan. They would be waiting below in a car; as soon as the ransom hit the ground, they would scoop it up and make good their escape. On the afternoon of May 21, 1924, Leopold and Loeb drove their rental car slowly around the streets of the South Side of Chicago, looking for a possible victim. At 5 o'clock, after driving around Kenwood for two hours, they were ready to abandon the kidnapping for another day. But as Leopold drove north along Ellis Avenue, Loeb, sitting in the rear passenger seat, suddenly saw his cousin, Bobby Franks, walking south on the opposite side of the road. Bobby's father, Loeb knew, was a wealthy businessman who would be able to pay the ransom. He tapped Leopold on the shoulder to indicate they had found their victim. Leopold turned the car in a circle, driving slowly down Ellis Avenue, gradually pulling alongside Bobby. "Hey, Bob," Loeb shouted from the rear window. The boy turned slightly to see the Willys-Knight stop by the curb. Loeb leaned forward, into the front passenger seat, to open the front door. "Hello, Bob. I'll give you a ride." The boy shook his head—he was almost home. "No, I can walk." "Come on in the car; I want to talk to you about the tennis racket you had yesterday. I want to get one for my brother." Bobby had moved closer now. He was standing by the side of the car. Loeb looked at him through the open window. Bobby was so close....Loeb could have grabbed him and pulled him inside, but he continued talking, hoping to persuade the boy to climb into the front seat. Bobby stepped onto the running board. The front passenger door was open, inviting the boy inside...and then suddenly Bobby slid himself into the front seat, next to Leopold. Loeb gestured toward his companion, "You know Leopold, don't you?" 
Bobby glanced sideways and shook his head—he did not recognize him. "No." "You don't mind [us] taking you around the block?" "Certainly not." Bobby turned around in the seat to face Loeb; he smiled at his cousin with an open, innocent grin, ready to banter about his success in yesterday's tennis game. The car slowly accelerated down Ellis Avenue. As it passed 49th Street, Loeb felt on the car seat beside him for the chisel. Where had it gone? There it was! They had taped up the blade so that the blunt end—the handle—could be used as a club. Loeb felt it in his hand. He grasped it more firmly. At 50th Street, Leopold turned the car left. As it made the turn, Bobby looked away from Loeb and glanced toward the front of the car. Loeb reached over the seat. He grabbed the boy from behind with his left hand, covering Bobby's mouth to stop him from crying out. He brought the chisel down hard—it smashed into the back of the boy's skull. Once again he pounded the chisel into the skull with as much force as possible—but the boy was still conscious. Bobby had now twisted halfway around in the seat, facing back to Loeb, desperately raising his arms as though to protect himself from the blows. Loeb smashed the chisel down two more times into Bobby's forehead, but still he struggled for his life. The fourth blow had gashed a large hole in the boy's forehead. Blood from the wound was everywhere, spreading across the seat, splashed onto Leopold's trousers, spilling onto the floor. It was inexplicable, Loeb thought, that Bobby was still conscious. Surely those four blows would have knocked him out? Loeb reached down and pulled Bobby suddenly upwards, over the front seat into the back of the car. He jammed a rag down the boy's throat, stuffing it down as hard as possible. He tore off a large strip of adhesive tape and taped the mouth shut. Finally! The boy's moaning and crying had stopped. Loeb relaxed his grip. Bobby slid off his lap and lay crumpled at his feet. Leopold and Loeb had expected to carry out the perfect crime. But as they disposed of the body—in a culvert at a remote spot several miles south of Chicago—a pair of eyeglasses fell from Leopold's jacket onto the muddy ground. Upon returning to the city, Leopold dropped the ransom letter into a post box; it would arrive at the Franks house at 8 o'clock the next morning. The following day, a passerby spotted the body and notified the police. The Franks family confirmed the identity of the victim as that of 14-year-old Bobby. The perfect crime had unraveled and now there was no longer any thought, on the part of Leopold and Loeb, of attempting to collect the ransom. By tracing Leopold's ownership of the eyeglasses, the state's attorney, Robert Crowe, was able to determine that Leopold and Loeb were the leading suspects. Ten days after the murder, on May 31, both boys confessed and demonstrated to the state's attorney how they had killed Bobby Franks. Crowe boasted to the press that it would be "the most complete case ever presented to a grand or petit jury" and that the defendants would certainly hang. Leopold and Loeb had confessed and shown the police crucial evidence—the typewriter used for the ransom letter—that linked them to the crime. The trial, Crowe quickly realized, would be a sensation. Nathan Leopold admitted they had murdered Bobby solely for the thrill of the experience. ("A thirst for knowledge is highly commendable, no matter what extreme pain or injury it may inflict upon others," Leopold had told a newspaper reporter. 
"A 6-year-old-boy is justified in pulling the wings from a fly, if by so doing he learns that without wings the fly is helpless.") The defendants' wealth, their intellectual ability, the high regard within Chicago for their families and the capricious nature of the homicide—everything combined to make the crime one of the most intriguing murders in the history of Cook County. Crowe also realized that he could turn the case to his own advantage. He was 45 years old, yet already he had had an illustrious career as chief justice of the criminal court and, since 1920, as state's attorney of Cook County. Crowe was a leading figure in the Republican Party with a realistic chance of winning election as Chicago's next mayor. To send Leopold and Loeb to the gallows for their murder of a child would, no doubt, find favor with the public. Indeed, the public's interest in the trial was driven by more than lurid fascination with the grisly details of the case. Sometime within the past few years the country had experienced a shift in public morality. Women now bobbed their hair, smoked cigarettes, drank gin and wore short skirts; sexuality was everywhere and young people were eagerly taking advantage of their new freedoms. The traditional ideals—centered on work, discipline and self-denial—had been replaced by a culture of self-indulgence. And what single event could better illustrate the dangers of such a transformation than the heinous murder of Bobby Franks? The evangelical preacher Billy Sunday, passing through Chicago on his way to Indiana, warned that the killing could be "traced to the moral miasma which contaminates some of our ‘young intellectuals.' It is now considered fashionable for higher education to scoff at God....Precocious brains, salacious books, infidel minds—all these helped to produce this murder." But while Crowe could count on the support of an outraged public, he faced a daunting adversary in the courtroom. The families of the confessed murderers had hired Clarence Darrow as defense attorney. By 1894, Darrow had achieved notoriety within Cook County as a clever speaker, an astute lawyer and a champion of the weak and defenseless. One year later, he would become the most famous lawyer in the country, when he successfully defended Socialist labor leader Eugene Debs against conspiracy charges that grew out of a strike against the Pullman Palace Car Company. Crowe could attest firsthand to Darrow's skills. In 1923, Darrow had humiliated him in the corruption trial of Fred Lundin, a prominent Republican politician. Like Crowe, Darrow knew that he might be able to play the trial of Leopold and Loeb to his advantage. Darrow was passionately opposed to the death penalty; he saw it as a barbaric and vengeful punishment that served no purpose except to satisfy the mob. The trial would provide him with the means to persuade the American public that the death penalty had no place in the modern judicial system. Darrow's opposition to capital punishment found its greatest source of inspiration in the new scientific disciplines of the early 20th century. "Science and evolution teach us that man is an animal, a little higher than the other orders of animals; that he is governed by the same natural laws that govern the rest of the universe," he wrote in the magazine Everyman in 1915. Darrow saw confirmation of these views in the field of dynamic psychiatry, which emphasized infantile sexuality and unconscious impulses and denied that human actions were freely chosen and rationally arranged. 
Individuals acted less on the basis of free will and more as a consequence of childhood experiences that found their expression in adult life. How, therefore, Darrow reasoned, could any individual be responsible for his or her actions if they were predetermined? Endocrinology—the study of the glandular system—was another emerging science that seemed to deny the existence of individual responsibility. Several recent scientific studies had demonstrated that an excess or deficiency of certain hormones produced mental and physical alterations in the afflicted person. Mental illness was closely correlated with physical symptoms that were a consequence of glandular action. Crime, Darrow believed, was a medical problem. The courts, guided by psychiatry, should abandon punishment as futile and in its place should determine the proper course of medical treatment for the prisoner. Such views were anathema to Crowe. Could any philosophy be more destructive of social harmony than Darrow's? The murder rate in Chicago was higher than ever, yet Darrow would do away with punishment. Crime, Crowe believed, would decline only through the more rigorous application of the law. Criminals were fully responsible for their actions and should be treated as such. The stage was set for an epic courtroom battle. Still, in terms of legal strategy, the burden fell heaviest on Darrow. How would he plead his clients? He could not plead them innocent, since both had confessed. There had been no indication that the state's attorney had obtained their statements under duress. Would Darrow plead them not guilty by reason of insanity? Here too was a dilemma, since both Leopold and Loeb appeared entirely lucid and coherent. The accepted test of insanity in the Illinois courts was the inability to distinguish right from wrong and, by this criterion, both boys were sane. On July 21, 1924, the opening day of court, Judge John Caverly indicated that the attorneys for each side could present their motions. Darrow could ask the judge to appoint a special commission to determine if the defendants were insane. The results of an insanity hearing might abrogate the need for a trial; if the commission decided that Leopold and Loeb were insane, Caverly could, on his own initiative, send them to an asylum. It was also possible that the defense would ask the court to try each defendant separately. Darrow, however, already had expressed his belief that the killing was a consequence of each defendant influencing the other. There was no indication, therefore, that the defense would argue for a severance. Nor was it likely that Darrow would ask the judge to delay the start of the trial beyond August 4, its assigned date. Caverly's term as chief justice of the criminal court would expire at the end of August. If the defense requested a continuance, the new chief justice, Jacob Hopkins, might assign a different judge to hear the case. But Caverly was one of the more liberal justices on the court; he had never voluntarily sentenced a defendant to death; and it would be foolish for the defense to request a delay that might remove him from the case. Darrow might also present a motion to remove the case from the Cook County Criminal Court. Almost immediately after the kidnapping, Leopold had driven the rental car across the state line into Indiana. Perhaps Bobby had died outside Illinois and therefore the murder did not fall within the jurisdiction of the Cook County court. 
But Darrow had already declared that he would not ask for a change of venue and Crowe, in any case, could still charge Leopold and Loeb with kidnapping, a capital offense in Illinois, and hope to obtain a hanging verdict. Darrow chose none of these options. Nine years earlier, in an otherwise obscure case, Darrow had pleaded Russell Pethick guilty of the murder of a 27-year-old housewife and her infant son but had asked the court to mitigate the punishment on account of the defendant's mental illness. Now he would attempt the same strategy in the defense of Nathan Leopold and Richard Loeb. His clients were guilty of murdering Bobby Franks, he told Caverly. Nevertheless he wished the judge to consider three mitigating factors in determining their punishment: their age, their guilty plea and their mental condition. It was a brilliant maneuver. By pleading them guilty, Darrow avoided a trial by jury. Caverly would now preside over a hearing to determine punishment—a punishment that might range from the death penalty to a minimum of 14 years in prison. Clearly it was preferable for Darrow to argue his case before a single judge than before 12 jurors susceptible to public opinion and Crowe's inflammatory rhetoric. Darrow had turned the case on its head. He no longer needed to argue insanity in order to save Leopold and Loeb from the gallows. He now needed only to persuade the judge that they were mentally ill—a medical condition, not at all equivalent or comparable to insanity—to obtain a reduction in their sentence. And Darrow needed only a reduction from death by hanging to life in prison to win his case. And so, during July and August 1924, the psychiatrists presented their evidence. William Alanson White, the president of the American Psychiatric Association, told the court that both Leopold and Loeb had experienced trauma at an early age at the hands of their governesses. Loeb had grown up under a disciplinary regimen so exacting that, in order to escape punishment, he had had no other recourse but to lie to his governess, and so, in White's account at least, he had been set on a path of criminality. "He considered himself the master criminal mind of the century," White testified, "controlling a large band of criminals, whom he directed; even at times he thought of himself as being so sick as to be confined to bed, but so brilliant and capable of mind...[that] the underworld came to him and sought his advice and asked for his direction." Leopold also had been traumatized, having been sexually intimate with his governess at an early age. Other psychiatrists—William Healy, the author of The Individual Delinquent, and Bernard Glueck, professor of psychiatry at the New York Postgraduate School and Hospital—confirmed that both boys possessed a vivid fantasy life. Leopold pictured himself as a strong and powerful slave, favored by his sovereign to settle disputes in single-handed combat. Each fantasy interlocked with the other. Loeb, in translating his fantasy of being a criminal mastermind into reality, required an audience for his misdeeds and gladly recruited Leopold as a willing participant. Leopold needed to play the role of the slave to a powerful sovereign—and who, other than Loeb, was available to serve as Leopold's king? Crowe had also recruited prominent psychiatrists for the prosecution. 
They included Hugh Patrick, president of the American Neurological Association; William Krohn and Harold Singer, authors of Insanity and the Law: A Treatise on Forensic Psychiatry; and Archibald Church, professor of mental diseases and medical jurisprudence at Northwestern University. All four testified that neither Leopold nor Loeb displayed any sign of mental derangement. They had examined both prisoners in the office of the state's attorney shortly after their arrest. "There was no defect of vision," Krohn testified, "no defect of hearing, no evidence of any defect of any of the sense paths or sense activities. There was no defect of the nerves leading from the brain as evidenced by gait or station or tremors." Each set of psychiatrists—one for the state, the other for the defense—contradicted the other. Few observers noticed that each side spoke for a different branch of psychiatry and was, therefore, separately justified in reaching its verdict. The expert witnesses for the state, all neurologists, had found no evidence that any organic trauma or infection might have damaged either the cerebral cortex or the central nervous system of the defendants. The conclusion reached by the psychiatrists for the prosecution was, therefore, a correct one—there was no mental disease. The psychiatrists for the defense—White, Glueck and Healy—could assert, with equal justification, that, according to their understanding of psychiatry, an understanding informed by psychoanalysis, the defendants had suffered mental trauma during childhood that had damaged each boy's ability to function competently. The result was compensatory fantasies that had led directly to the murder. Most commentators, however, were oblivious to the epistemological gulf that separated neurology from psychoanalytic psychiatry. The expert witnesses all claimed to be psychiatrists, after all; and it was, everyone agreed, a dark day for psychiatry when leading representatives of the profession could stand up in court and contradict each other. If men of national reputation and eminence could not agree on a common diagnosis, then could any value be attached to a psychiatric judgment? Or perhaps each group of experts was saying only what the lawyers required them to say—for a fee, of course. It was an evil that contaminated the entire profession, thundered the New York Times, in an editorial similar to dozens of others during the trial. The experts in the hearing were "of equal authority as alienists and psychiatrists," apparently in possession of the same set of facts, who, nevertheless, gave out "opinions exactly opposite and contradictory as to the past and present condition of the two prisoners.... Instead of seeking truth for its own sake and with no preference as to what it turns out to be, they are supporting, and are expected to support, a predetermined purpose....That the presiding Judge," the editorial writer concluded sorrowfully, "is getting any help from those men toward the forming of his decision hardly is to be believed." At 9:30 on the morning of September 10, 1924, Caverly prepared to sentence the prisoners. The final day of the hearing was to be broadcast live over station WGN, and throughout the city, groups of Chicagoans clustered around radio sets to listen. The metropolis had paused in its morning bustle to hear the verdict. Caverly's statement was brief. In determining punishment, he gave no weight to the guilty plea. 
Normally a guilty plea could mitigate punishment if it saved the prosecution the time and trouble of demonstrating culpability; but that had not been the case on this occasion. The psychiatric evidence also could not be considered in mitigation. The defendants, Caverly stated, "have been shown in essential respects to be abnormal....The careful analysis made of the life history of the defendants and of their present mental, emotional and ethical condition has been of extreme interest....And yet the court feels strongly that similar analyses made of other persons accused of crime would probably reveal similar or different abnormalities....For this reason the court is satisfied that his judgment in the present case cannot be affected thereby." Nathan Leopold and Richard Loeb had been 19 and 18 years old, respectively, at the time of the murder. Did their youth mitigate the punishment? The prosecuting attorneys, in their concluding statements to the court, had emphasized that many murderers of similar age had been executed in Cook County; and none had planned their deeds with as much deliberation and forethought as Leopold and Loeb. It would be outrageous, Crowe had argued, for the prisoners to escape the death penalty when others—some even younger than 18—had been hanged. Yet, Caverly decided he would hold back from imposing the extreme penalty on account of the age of the defendants. He sentenced each defendant to 99 years for the kidnapping and life in prison for the murder. "The court believes," Caverly stated, "that it is within his province to decline to impose the sentence of death on persons who are not of full age. This determination appears to be in accordance with the progress of criminal law all over the world and with the dictates of enlightened humanity." The verdict was a victory for the defense, a defeat for the state. The guards allowed Leopold and Loeb to shake Darrow's hand before escorting the prisoners back to their cells. Two dozen reporters crowded around the defense table to hear Darrow's response to the verdict and, even in his moment of victory, Darrow was careful not to seem too triumphal: "Well, it's just what we asked for but...it's pretty tough." He pushed back a lock of hair that had fallen over his forehead, "It was more of a punishment than death would have been." Crowe was furious at the judge's decision. In his statement to the press, Crowe made sure everyone knew whom to blame: "The state's attorney's duty was fully performed. He is in no measure responsible for the decision of the court. The responsibility for that decision rests with the judge alone." Later that evening, however, Crowe's rage would emerge in full public view, when he issued another, more inflammatory statement: "[Leopold and Loeb] had the reputation of being immoral...degenerates of the worst type....The evidence shows that both defendants are atheists and followers of the Nietzschean doctrines...that they are above the law, both the law of God and the law of man....It is unfortunate for the welfare of the community that they were not sentenced to death." As for Nathan Leopold and Richard Loeb, their fates would take divergent paths. In 1936, inside Stateville Prison, James Day, a prisoner serving a sentence for grand larceny, stabbed Loeb in the shower room and despite the best efforts of the prison doctors, Loeb, then 30 years old, died of his wounds shortly afterward. Leopold served 33 years in prison until he won parole in 1958. 
At the parole hearing, he was asked whether he realized that every media outlet in the country would want an interview with him. Already there was a rumor that Ed Murrow, the CBS correspondent, wanted him to appear on his television show "See It Now." "I don't want any part of lecturing, television or radio, or trading on the notoriety," Leopold replied. The confessed murderer who had once deemed himself a superman stated, "All I want, if I am so lucky as to ever see freedom again, is to try to become a humble little person." Upon his release, Leopold moved to Puerto Rico, where he lived in relative obscurity, studying for a degree in social work at the University of Puerto Rico, writing a monograph on the birds of the island, and, in 1961, marrying Trudi Garcia de Quevedo, the expatriate widow of a Baltimore physician. During the 1960s, Leopold was finally able to travel to Chicago. He returned to the city often, to see old friends, to tour the South Side neighborhood near the university and to place flowers on the graves of his mother and father and two brothers. It had been so long ago—that summer of 1924, in the stuffy courtroom on the sixth floor of the Cook County Criminal Court—and now he was the sole survivor. The crime had passed into legend; its thread had been woven into the tapestry of Chicago's past; and when Nathan Leopold, at age 66, died in Puerto Rico of a heart attack on August 29, 1971, the newspapers wrote of the murder as the crime of the century, an event so inexplicable and so shocking that it would never be forgotten. © 2008 by Simon Baatz, adapted from For the Thrill of It: Leopold, Loeb, and the Murder that Shocked Chicago, published by HarperCollins.
c4bfdee52e417e13a6ac2ecd6dbd96e8
https://www.smithsonianmag.com/history/letters-mothers-president-lincoln-180951395/
Letters from Mothers to President Lincoln
Letters from Mothers to President Lincoln In the summer of 1818, when Abraham Lincoln was nine years old, his mother, Nancy, caught “the milk-sick,” a then-mysterious disease caused by drinking the milk of cows that had eaten white snakeroot. (We know it today as tremetol poisoning.) Her breath grew shorter, her skin turned sallow and cold, her pulse faded and slowed. Within a week she was dead. In adulthood, Lincoln confided to a friend about how lonely he felt in the months afterward, and how he found solace in the Bible stories his mother had told him; the words restored her voice to his mind’s ear. “All that I am, or hope to be,” he said, “I owe to my angel mother.” Doubtless Lincoln thought of his mother when he received letters from women whose sons were fighting in the Civil War. In honor of Nancy Lincoln—and American mothers from every century—we sample below motherly missives to the president. Letters have been edited for length, but retain their original spelling and grammar. *** President U States Hon. A Lincoln Dear Sir Will you excuse my daring to address you, and enclosing this petition for my eldest son, for, your kind consideration. It will tell you all I need, and allow me to say a few words. I know you will listen to them for you have a kind heart, and my story is a sad one. I am a widow left with only these two sons, who have both left me, to fight for the good cause and I am proud to send them forth allthough they leave me desolate, and, heart broken, as they were all I had, for my support, and were my only hope in this world, but I have given them up, but trust in God's mercy, to return them to me, some day. My eldest son is First Lieut in the 15th Regiment, and educated for the army wishes a permanent place in it my youngest son, is a Private soldier in Gen Duryea' 5th Regiment Advance Guards, now at Fort Monroe. he is a druggist by Proffesion and almost a Phycian. he was my only stay, because the youngest and to have him perhaps forever taken away from me almost kills me. My health is extremely delicate and if he could only have a higher place than a private in the Regiment, would make me feel better if he could assist in the Medical Staff in the Hospital, perhaps I am wild to ask such things but I know you can do all things.… Dont dear Mr Lincoln refuse to listen to a Widow'ed Mother prayer. Will you look favorable on this petition. Let me ask your forgiveness for trespassing but you will excuse a broken hearted woman. Cornelia Ludlow Beekman July 1861 *** To: The Hon. Pres. A Lincoln I humbly pray you to pardon my son Benjamin F Stevens who is under arrest & probably sentenced for going to sleep on guard in the 49th regt Indiana Vols.… He is but sixteen years of age. I humbly ever pray Mrs. Eliza J Stevens Seymour, Indiana April 1862 *** Excellent Sir My good friend says I must write to you and she will send it. My son went in the 54th [Massachusetts] regiment.  I am a colored woman and my son was strong and able as any to fight for his country and the colored people have as much to fight for as any. My father was a Slave and escaped from Louisiana before I was born morn forty years agone  I have but poor edication but I never went to schol, but I know just as well as any what is right between man and man.  Now I know it is right that a colored man should go and fight for his country, and so ought to a white man. I know that a colored man ought to run no greater risques than a white, his pay is no greater his obligation to fight is the same. 
So why should not our enemies be compelled to treat him the same, Made to do it. My son fought at Fort Wagoner but thank God he was not taken prisoner, as many were  I thought of this thing before I let my boy go but then they said Mr. Lincoln will never let them sell our colored soldiers for slaves,  if they do he will get them back quck he will rettallyate and stop it. Now Mr Lincoln dont you think you oght to stop this thing and make them do the same by the colored men  they have lived in idleness all their lives on stolen labor and made savages of the colored people, but they now are so furious because they are proving themselves to be men, such as have come away and got some edication. It must not be so. You must put the rebels to work in State prisons to making shoes and things, if they sell our colored soldiers, till they let them all go. And give their wounded the same treatment. it would seem cruel, but their no other way, and a just man must do hard things sometimes, that shew him to be a great man.  They tell me some do you will take back the [Emancipation] Proclamation,  don't do it.  When you are dead and in Heaven, in a thousand years that action of yours will make the Angels sing your praises I know it…. Will you see that the colored men fighting now, are fairly treated.  You ought to do this, and do it at once, Not let the thing run along   meet it quickly and manfully, and stop this, mean cowardly cruelty.  We poor oppressed ones, appeal to you, and ask fair play.  Yours for Christs sake Hannah Johnson Buffalo, New York July 1863 *** Sir, I have as you know, a son, an only and most dearly loved son, in the Southern Army; and I know, am well assured that if I can reach Richmond I shall be enabled to procure for him an honorable discharge from the army and an opportunity of being once more united (in a foreign land) to his mother and child. I ask you now for the permit to go south, and oh—Mr Lincoln by the love you bear to your dear ones who are yet spared to you, as well as that you bear for those, that whom God has called to await you in another and a happier world, grant my petition. Let me go, and if I should fail in the main object of my journey—still I shall once more see my child face to face, and his little boy, may take away a memory of his father, which otherwise he may never have. You may trust my honor, for taking nothing contraband, nor compromising my government by letter or word of mouth— Yield to my entreaties and receive the ever grateful remembrance of Yours Respectfully Harriette B. Prentice Louisville, Kentucky January 1864 *** dear Sir! Permit me the honor of an interview, with your excellency. I have ventured again alone my Husband’s official duties debar him from accompaning me. Though my errand in behalf of my Son in-law Capt. John D. O’Connell of the Regular Army—necessitates immediate attention—The Captain’s situation likewise compells his constant presence, where he is in comd. over the Recruit.g Service of the 14th Inft.y at Fort Trumbull New London, Conn. and where he has himself been recruiting his own health, from serious wounds, and I am most happy to inform your Excellency, that I had the pleasure of discharging him, my self from his bandages leather straps! 
and new set of teeth fills the void, made by a horse's foot, which almost dismembered his upper lip— Having been wedged under his dead horse shot under him, but before the fatal ball which did this mischief, it was first crimsoned by passing through the knee of its rider.… While he laid helpless from loss of blood, and wedged under his dead horse, another horse frantic from pain having been shot, plunged over him planting his fore foot over his upper lip from the base of the nose, completely carrying it away from the face, which hung to the cheek by a small fragment of flesh knocking out all his front teeth, out. When I journed to meet him, he presented a pitiful sight— But after careful watching and constant attention, my noble and daring Son-in-law is now ready to resume Field duties— again— He is not discouragd by his experience in defence of his Flag — And is ready to front the enemy — as soon as he is permitted His younger brother whom I equiped for the Field was killd in battle, with two of my nephews! All three young Lieuts. Brave Boys! I glory to claim them my own dear flesh & blood—and I am proud to inform your Excellency that I still am honored by having yet three more nephews at this hour, on Field duty. And my errand is to put another in the field yet more nearer me still, my only Son, whose prayer to me is to get him also in the Army he is now twenty one years old and craves a commission to some Regiment. He is now on field duty, in capacity of clerkship. Left college to serve his country. I am a Stranger here, and if required to be formally presented I really know not to whom I can call upon… Please honor me with a line if it be possible I can call on your excellency, and when? Not with the crowd but alone, as I will be alone with my little Daughter. I have the honor to remain your Excellency's humble Servt Mrs Col. Martin Burke Washington, D.C. February 1864 *** Our Most worthey presedent please Excuase Me for takeing this Liberty But I Cannot Express my Grate gratitude for your kindness in granting Me the order for My Son john H Bowden’s of Chicago discharge what Goverment Bounty he has receved I have that Unbroken to refuned But the 1 hundred Dollers County Bounty I have Not Got It as I had to Use it Last winter to Maintain My Sick Boy and a dependant Sister I have Bin a widow Eleven years My Oldest Son a Loosing his health on Cheat Mountain Makes it Vary Bad for Us our kind president If you Can releave Me So that I Can take My Boy home with me I feel that God will reward you and I No he will Bless all your Undertakings please Answare Respectifully Mrs Ann Bowden Washington, D.C. June 1864 *** On the first of this month my Son Eugene N. C. Promie, aged 17 years old, with two other lads were enticed by two men, offering them situations to Learn the Engineering in the United States navy, being taken to New York against my Will or consent, after arriving there they were forced into A Carriage, taken to Williamsburg to the Provost Marshall's Office, and there Sold as Substitutes in the Army (the men I learn having made Nineteen hundred dollars by the act) and immediatly conveyed to Hart Island and from there sent to the front, his Father being in New York at the time my Son desired to see him, to get his consent, as that was the provision made, but was not allowed, but was forced away as before stated by threats, the Men are now in Prison for Abduction. 
My poor boy I have just received A Letter from who is now in the Chesapeak Hospital Sick and expected to be sent away; My Dear Boy is just from College inexperienced and but a Child And Oh! let not the appeal of a Mothers Grief be in Vain I am unable through my distressed feelings to dictate to you A more appealing letter… not the appeal of A Sorrowful Mother be in Vain… Hoping the Prayer of A Mother may be heard through you and my Son restored to me I remain your Esteemed Friend Amanda A Promie Philadelphia June 1864 *** Mr Lincoln Allow me to congratulate you on your re-election. It is certainly a very great compliment to be invited to preside over the destinies of a great nation—a second term…. You have never refused me any thing I have asked—I hope I have not been unreasonable—or imposed on your naturally kind benevolent disposition.—I have a young son—Lemuel S. Hardin— who has been a short time in the Southern Army—has been severely wounded—he has made his way through the lines—and is now in Canada—He is crippled for life—and is anxious to return to his home and family—He has been a resident for the last three years in Louisville Ky. … After a young man has—“sown his wild oats”—or—“seen the elephant”—he is often better prepared to settle down and become a sensible man—he has a better appreciation of home and the advantage of a good position. Mr President—I claim your indulgance in favor of my petition—not on the merit of the case—but as an act of clemency to a wayward youth— My waif of a son is endowed with many of the good qualities of the noble man from which he comes—both of head and heart. Yours respectfully—S E. Walworth December 1864 *** To his Excellency Abraham Lincoln: Sir, A sick and almost heartbroken mother again sentenced to make another appeal to you for the release of her dear son, Samuel Hardinge Jr., who, through gross misrepresentation and exaggeration on the part of enemies, was first confined in Carroll Prison; and afterward, without being allowed to vindicate his own innocence, transferred to Fort Delaware. [Hardinge was the husband of Belle Boyd, a Confederate spy.] In the only letters which I have received from him since he has been there, he thus writes: “Oh My God! How long am I to remain in this horrible place, full of rebels and secessionists. Oh my parents! Do all you possibly can to get me out of here. My God! My poor wife in England! She tells me in a letter—“For God’s sake to send her some money!” And I in prison! Why should they put me in here! I who have taken the oath of allegiance to the U.S. Government and who have never done anything against it. Oh it is hard! And I pray God faily and nightly that President Lincoln may grant my release!” I transcribe his own words that you may see what his real feelings are. I told you, sir, in my recent interview with you, that he might, so far as I know, have been guilty of some small utterances, smarting as he was under the unfair and cruel suspicions cast upon him in the affair of the “Greyhound”; but, guilty of a single act against the good of his country—never! You, sir, can judge for yourself whether or not this is the language of a foe to the Government. Oh President Lincoln! I implore and entreat of you to grant my son’s release! My health is rapidly failing under this dreadful blow! I appeal to your kindly nature!... 
When you think of the magnificent glorious Christmas gift which General Sherman presented to you, will you not confer upon a poor heartbroken mother, the—to you, small—News Years gift of the liberty of her dear son. Sarah A. M. Hardinge Brooklyn, New York January 1865 *** Honble Abraham Lincoln President of the U.S. America I have heard from good authority that if I suppress the Book I have now ready for publication, you may be induced to consider leniently the case of my husband, S. Wylde Hardinge, now a prisoner in Fort Delaware, I think it would be well for you & me to come to some definite understanding. My Book was originally not intended to be more than a personal narrative, but since my husband’s unjust arrest I had intended making it political, & had introduced many atrocious circumstances respecting your government with which I am so well acquainted & which would open the eyes of Europe to many things of which the world on this side of the water little dreams. If you will release my husband & set him free, so that he may join me here in England by the beginning of March—I pledge you my word that my Book shall be suppressed. Should my husband not be with me by the 25th of March I shall at once place my Book in the hands of a publisher. Trusting an immediate reply, I am Sir, Yr. Obdt. Sevt. Belle Boyd Hardinge England January 1865 Lincoln made no notation on Belle’s letter, nor did he indicate any knowledge of the “atrocious circumstances” to which she referred. Perhaps because the war was almost over, perhaps because Samuel Hardinge’s only crime was being Belle’s husband, perhaps because the president admired the rebel girl’s audacity, the prisoner was released on February 3, ten days after Belle made her demand. She would name her baby daughter Grace, and, later, her son Arthur Davis Lee Jackson, after her favorite Confederate heroes. Sources: Books: Michael Burlingame. The Inner World of Abraham Lincoln. Urbana: University of Illinois Press, 1994; Harold Holzer. Dear Mr. Lincoln: Letters to the President. Reading, MA: Addison-Wesley, 1993. Articles: Louis A. Sigaud. “When Belle Boyd Wrote Lincoln.” Lincoln Herald, Vol. 50 (February 1948). Online: The Abraham Lincoln Papers at the Library of Congress: http://memory.loc.gov/ammem/alhtml/malhome.html. Karen Abbott is a contributing writer for history for Smithsonian.com and the author of the books Sin in the Second City and American Rose. Her forthcoming book, Liar, Temptress, Soldier, Spy, will be published by HarperCollins in September.
c93e22c62d0686ad31f6b820cbc832de
https://www.smithsonianmag.com/history/lincoln-as-commander-in-chief-131322819/
Lincoln as Commander in Chief
Lincoln as Commander in Chief When the American Civil War began, President Abraham Lincoln was far less prepared for the task of commander in chief than his Southern adversary. Jefferson Davis had graduated from West Point (in the lowest third of his class, to be sure), commanded a regiment that fought intrepidly at Buena Vista in the Mexican War and served as secretary of war in the Franklin Pierce administration from 1853 to 1857. Lincoln's only military experience had come in 1832, when he was captain of a militia unit that saw no action in the Black Hawk War, which began when Sac and Fox Indians (led by the war chief Black Hawk) tried to return from Iowa to their ancestral homeland in Illinois in alleged violation of a treaty of removal they had signed. During Lincoln's one term in Congress, he mocked his military career in an 1848 speech. "Did you know I am a military hero?" he said. "I fought, bled and came away" after "charges upon the wild onions" and "a good many bloody struggles with the Musquetoes." When he called state militia into federal service on April 15, 1861—following the Confederate bombardment of Fort Sumter—Lincoln therefore faced a steep learning curve as commander in chief. He was a quick study, however; his experience as a largely self-taught lawyer with a keen analytical mind who had mastered Euclidean geometry for mental exercise enabled him to learn quickly on the job. He read and absorbed works on military history and strategy; he observed the successes and failures of his own and the enemy's military commanders and drew apt conclusions; he made mistakes and learned from them; he applied his large quotient of common sense to slice through the obfuscations and excuses of military subordinates. By 1862 his grasp of strategy and operations was firm enough almost to justify the overstated but not entirely wrong conclusion of historian T. Harry Williams: "Lincoln stands out as a great war president, probably the greatest in our history, and a great natural strategist, a better one than any of his generals." As president of the nation and leader of his party as well as commander in chief, Lincoln was principally responsible for shaping and defining national policy. From first to last, that policy was the preservation of the United States as one nation, indivisible, and as a republic based on majority rule. Although Lincoln never read Karl von Clausewitz's famous treatise On War, his actions were a consummate expression of Clausewitz's central argument: "The political objective is the goal, war is the means of reaching it, and means can never be considered in isolation from their purpose. Therefore, it is clear that war should never be thought of as something autonomous but always as an instrument of policy." Some professional military commanders tended to think of war as "something autonomous" and deplored the intrusion of political considerations into military matters. Take the notable example of "political generals." Lincoln appointed many prominent politicians with little or no military training or experience to the rank of brigadier or major general. Some of them received these appointments so early in the war that they subsequently outranked professional, West Point–educated officers. Lincoln also commissioned important ethnic leaders as generals with little regard to their military merits. Historians who deplore the abundance of political generals sometimes cite an anecdote to mock the process. 
One day in 1862, the story goes, Lincoln and Secretary of War Edwin M. Stanton were going over a list of colonels for promotion to brigadier general. Coming to the name of Alexander Schimmelfennig, the president said that "there has got to be something done unquestionably in the interest of the Dutch, and to that end I want Schimmelfennig appointed." Stanton protested that there were better-qualified German-Americans. "No matter about that," Lincoln supposedly said, "his name will make up for any difference there may be." General Schimmelfennig is remembered today mainly for hiding for three days in a woodshed next to a pigpen to escape capture at Gettysburg. Other political generals are also remembered more for their military defeats or blunders than for any positive achievements. Often forgotten are the excellent military records of some political generals like John A. Logan and Francis P. Blair (among others). And some West Pointers, notably Ulysses S. Grant and William T. Sherman, might have languished in obscurity had it not been for the initial sponsorship of Grant by Congressman Elihu B. Washburne and of Sherman by his brother John, a U.S. senator. Even if all political generals, or generals in whose appointments politics played a part, turned out to have mediocre military records, however, the process would have had a positive impact on national strategy by mobilizing their constituencies for the war effort. On the eve of the war, the U.S. Army consisted of approximately 16,400 men, of whom about 1,100 were commissioned officers. Of these, some 25 percent resigned to join the Confederate army. By April 1862, when the war was a year old, the volunteer Union army had grown to 637,000 men. This mass mobilization could not have taken place without an enormous effort by local and state politicians as well as by prominent ethnic leaders. Another important issue that began as a question of national strategy eventually crossed the boundary to become policy as well. That was the issue of slavery and emancipation. During the war's first year, one of Lincoln's top priorities was to keep border state Unionists and Northern antiabolitionist Democrats in his war coalition. He feared, with good reason, that the balance in three border slave states might tip to the Confederacy if his administration stepped prematurely toward emancipation. When Gen. John C. Frémont issued a military order freeing the slaves of Confederate supporters in Missouri, Lincoln revoked it in order to quell an outcry from the border states and Northern Democrats. To sustain Frémont's order, Lincoln believed, "would alarm our Southern Union friends, and turn them against us—perhaps ruin our rather fair prospect for Kentucky....I think that to lose Kentucky is nearly the same as to lose the whole game. Kentucky gone, we can not hold Missouri, nor as I think, Maryland. These all against us, and the job on our hands is too large for us. We would as well consent to separation at once, including the surrender of this capitol." During the next nine months, however, the thrust of national strategy shifted away from conciliating the border states and anti-emancipation Democrats. The antislavery Republican constituency grew louder and more demanding. The argument that slavery had brought on the war and that reunion with slavery would only sow the seeds of another war became more insistent. The evidence that slave labor sustained the Confederate economy and the logistics of Confederate armies grew stronger. 
Counteroffensives by Southern armies in the summer of 1862 wiped out many of the Union gains of the winter and spring. Many northerners, including Lincoln, became convinced that bolder steps were necessary. To win the war over an enemy fighting for and sustained by slavery, the North must strike at slavery. In July 1862, Lincoln decided on a major change in national strategy. Instead of deferring to the border states and Northern Democrats, he would activate the Northern antislavery majority that had elected him and mobilize the potential of black manpower by issuing a proclamation of freedom for slaves in rebellious states—the Emancipation Proclamation. "Decisive and extreme measures must be adopted," Lincoln told members of his cabinet, according to Secretary of the Navy Gideon Welles. Emancipation was "a military necessity, absolutely necessary to the preservation of the Union. We must free the slaves or be ourselves subdued." By trying to convert a Confederate resource to Union advantage, emancipation thus became a crucial part of the North's national strategy. But the idea of putting arms in the hands of black men provoked even greater hostility among Democrats and border state Unionists than emancipation itself. In August 1862, Lincoln told delegates from Indiana who offered to raise two black regiments that "the nation could not afford to lose Kentucky at this crisis" and that "to arm the negroes would turn 50,000 bayonets from the loyal border States against us that were for us." Three weeks later, however, the president quietly authorized the War Department to begin organizing black regiments on the South Carolina Sea Islands. And by March 1863, Lincoln had told his military governor of occupied Tennessee that "the colored population is the great available and yet unavailed of, force for restoring the Union. The bare sight of fifty thousand armed, and drilled black soldiers on the banks of the Mississippi, would end the rebellion at once. And who doubts that we can present that sight, if we but take hold in earnest." This prediction proved overoptimistic. But in August 1863, after black regiments had proved their worth at Fort Wagner and elsewhere, Lincoln told opponents of their employment that in the future "there will be some black men who can remember that, with silent tongue, and clenched teeth, and steady eye, and well-poised bayonet, they have helped mankind on to this great consummation; while, I fear, there will be some white ones, unable to forget that, with malignant heart, and deceitful speech, they have strove to hinder it." Lincoln also took a more active, hands-on part in shaping military strategy than presidents have done in most other wars. This was not necessarily by choice. Lincoln's lack of military training inclined him at first to defer to General in Chief Winfield Scott, America's most celebrated soldier since George Washington. But Scott's age (75 in 1861), poor health and lack of energy placed a greater burden on the president. Lincoln was also disillusioned by Scott's March 1861 advice to yield both Forts Sumter and Pickens. Scott's successor, Gen. George B. McClellan, proved an even greater disappointment to Lincoln. In early December 1861, after McClellan had been commander of the Army of the Potomac for more than four months and had done little with it except conduct drills and reviews, Lincoln drew on his reading and discussions of military strategy to propose a campaign against Confederate Gen. Joseph E. 
Johnston's army, then occupying the Manassas-Centreville sector 25 miles from Washington. Under Lincoln's plan, part of the Army of the Potomac would feign a frontal attack while the rest would use the Occoquan Valley to move up on the flank and rear of the enemy, cut its rail communications and catch it in a pincer movement. It was a good plan; indeed it was precisely what Johnston most feared. McClellan rejected it in favor of a deeper flanking movement all the way south to Urbana on the Rappahannock River. Lincoln posed a series of questions to McClellan, asking him why his distant-flanking strategy was better than Lincoln's short-flanking plan. Three sound premises underlay Lincoln's questions: first, the enemy army, not Richmond, should be the objective; second, Lincoln's plan would enable the Army of the Potomac to operate near its own base (Alexandria) while McClellan's plan, even if successful, would draw the enemy back toward his base (Richmond) and lengthen the Union supply line; and third, "does not your plan involve a greatly larger expenditure of time...than mine?" McClellan brushed off Lincoln's questions and proceeded with his own plan, bolstered by an 8–4 vote of his division commanders in favor of it, which caused Lincoln reluctantly to acquiesce. Johnston then threw a monkey wrench into McClellan's Urbana strategy by withdrawing from Manassas to the south bank of the Rappahannock—in large part to escape the kind of maneuver Lincoln had proposed. McClellan now shifted his campaign all the way to the Virginia peninsula between the York and James rivers. Instead of attacking a line held by fewer than 17,000 Confederates near Yorktown with his own army, then numbering 70,000, McClellan, in early April, settled down for a siege that would give Johnston time to bring his whole army down to the peninsula. An exasperated Lincoln telegraphed McClellan on April 6: "I think you better break the enemies' line from York-town to Warwick River, at once. They will probably use time, as advantageously as you can." McClellan's only response was to comment petulantly in a letter to his wife that "I was much tempted to reply that he had better come & do it himself." In an April 9 letter to the general, Lincoln enunciated another major theme of his military strategy: the war could be won only by fighting the enemy rather than by endless maneuvers and sieges to occupy places. "Once more," wrote Lincoln, "let me tell you, it is indispensable to you that you strike a blow. You will do me the justice to remember I always insisted, that going down the Bay in search of a field, instead of fighting at or near Manassas, was only shifting, and not surmounting, a difficulty—that we would find the same, or equal intrenchments, at either place. The country will not fail to note—is now noting—that the present hesitation to move upon an intrenched enemy, is but the story of Manassas repeated." But the general who acquired the nickname of Tardy George never learned that lesson. The same was true of several other generals who did not live up to Lincoln's expectations. They seemed to be paralyzed by responsibility for the lives of their men as well as the fate of their army and nation. This intimidating responsibility made them risk-averse. This behavior especially characterized commanders of the Army of the Potomac, who operated in the glare of media publicity with the government in Washington looking over their shoulders. In contrast, officers like Ulysses S. Grant, George H. Thomas and Philip H. 
Sheridan got their start in the western theater hundreds of miles distant, where they worked their way up from command of a regiment step by step to larger responsibilities away from media attention. They were able to grow into these responsibilities and to learn the necessity of taking risks without the fear of failure that paralyzed McClellan. Meanwhile, Lincoln's frustration with the lack of activity in the Kentucky-Tennessee theater had elicited from him an important strategic concept. Generals Henry W. Halleck and Don C. Buell commanded in the two western theaters separated by the Cumberland River. Lincoln urged them to cooperate in a joint campaign against the Confederate army defending a line from eastern Kentucky to the Mississippi River. Both responded in early January 1862 that they were not yet ready. "To operate on exterior lines against an enemy occupying a central position will fail," wrote Halleck. "It is condemned by every military authority I have ever read." Halleck's reference to "exterior lines" described the conundrum of an invading or attacking army operating against an enemy that holds a defensive perimeter resembling a semi-circle—the enemy enjoys the advantage of "interior lines" that enables it to shift reinforcements from one place to another within that arc. By this time Lincoln had read some of those authorities (including Halleck) and was prepared to challenge the general's reasoning. "I state my general idea of the war," he wrote to both Halleck and Buell, "that we have the greater numbers, and the enemy has the greater facility of concentrating forces upon points of collision; that we must fail, unless we can find some way to making our advantage an over-match for his; and that this can be only done by menacing him with superior forces at different points, at the same time; so that we can safely attack, one, or both, if he makes no change; and if he weakens one to strengthen the other, forbear to attack the strengthened one, but seize and hold the weakened one, gaining so much." Lincoln clearly expressed here what military theorists define as "concentration in time" to counter the Confederacy's advantage of interior lines that enabled Southern forces to concentrate in space. The geography of the war required the North to operate generally on exterior lines while the Confederacy could use interior lines to shift troops to the point of danger. By advancing on two or more fronts simultaneously, Union forces could neutralize this advantage, as Lincoln understood but Halleck and Buell seemed unable to grasp. Not until Grant became general in chief in 1864 did Lincoln have a commander in place who would carry out this strategy. Grant's policy of attacking the enemy wherever he found it also embraced Lincoln's strategy of trying to cripple the enemy as far from Richmond (or any other base) as possible rather than maneuvering to occupy or capture places. From February to June 1862, Union forces had enjoyed remarkable success in capturing Confederate territory and cities along the south Atlantic coast and in Tennessee and the lower Mississippi Valley, including the cities of Nashville, New Orleans and Memphis. But Confederate counteroffensives in the summer recaptured much of this territory (though not these cities). Clearly, the conquest and occupation of places would not win the war so long as enemy armies remained capable of reconquering them. Lincoln viewed these Confederate offensives more as an opportunity than a threat. 
When the Army of Northern Virginia began to move north in the campaign that led to Gettysburg, Gen. Joseph Hooker proposed to cut in behind the advancing Confederate forces and attack Richmond. Lincoln rejected the idea. "Lee's Army, and not Richmond, is your true objective point," he wired Hooker on June 10, 1863. "If he comes toward the Upper Potomac, follow on his flank, and on the inside track, shortening your [supply] lines, whilst he lengthens his. Fight him when opportunity offers." A week later, as the enemy was entering Pennsylvania, Lincoln told Hooker that this invasion "gives you back the chance that I thought McClellan lost last fall" to cripple Lee's army far from its base. But Hooker, like McClellan, complained (falsely) that the enemy outnumbered him and failed to attack while Lee's army was strung out for many miles on the march. Hooker's complaints compelled Lincoln to replace him on June 28 with George Gordon Meade, who punished but did not destroy Lee at Gettysburg. When the rising Potomac trapped Lee in Maryland, Lincoln urged Meade to close in for the kill. If Meade could "complete his work, so gloriously prosecuted thus far," said Lincoln, "by the literal or substantial destruction of Lee's army, the rebellion will be over." Instead, Meade pursued the retreating Confederates slowly and tentatively, and failed to attack them before they managed to retreat safely over the Potomac on the night of July 13-14. Lincoln had been distressed by Meade's congratulatory order to his army on July 4, which closed with the words that the country now "looks to the army for greater efforts to drive from our soil every vestige of the presence of the invader." "Great God!" cried Lincoln. "This is a dreadful reminiscence of McClellan," who had proclaimed a great victory when the enemy retreated across the river after Antietam. "Will our Generals never get that idea out of their heads? The whole country is our soil." That, after all, was the point of the war. When word came that Lee had escaped, Lincoln was both angry and depressed. He wrote to Meade: "My dear general, I do not believe you appreciate the magnitude of the misfortune involved in Lee's escape....Your golden opportunity is gone, and I am distressed immeasurably because of it." Having gotten these feelings off his chest, Lincoln filed the letter away unsent. But he never changed his mind. And two months later, when the Army of the Potomac was maneuvering and skirmishing again over the devastated land between Washington and Richmond, the president declared that "to attempt to fight the enemy back to his intrenchments in Richmond...is an idea I have been trying to repudiate for quite a year." Five times in the war Lincoln tried to get his field commanders to trap enemy armies that were raiding or invading northward by cutting in south of them and blocking their routes of retreat: during Stonewall Jackson's drive north through the Shenandoah Valley in May 1862; Lee's invasion of Maryland in September 1862; Braxton Bragg's and Edmund Kirby Smith's invasions of Kentucky in the same month; Lee's invasion of Pennsylvania in the Gettysburg campaign; and Jubal Early's raid to the outskirts of Washington in July 1864. Each time his generals failed him, and in most cases they soon found themselves relieved of command. In all of these instances the slowness of Union armies trying to intercept or pursue the enemy played a key part in their failures. 
Lincoln expressed repeated frustration with the inability of his armies to march as light and fast as Confederate armies. Much better supplied than the enemy, Union forces were actually slowed down by the abundance of their logistics. Most Union commanders never learned the lesson pronounced by Confederate Gen. Richard Ewell that "the road to glory cannot be followed with much baggage." Lincoln's efforts to get his commanders to move faster with fewer supplies brought him into active participation at the operational level of his armies. In May 1862 he directed General Irvin McDowell to "put all possible energy and speed into the effort" to trap Jackson in the Shenandoah Valley. Lincoln probably did not fully appreciate the logistical difficulties of moving large bodies of troops, especially in enemy territory. On the other hand, the president did comprehend the reality expressed by the Army of the Potomac's quartermaster in response to McClellan's incessant requests for more supplies before he could advance after Antietam, that "an army will never move if it waits until all the different commanders report that they are ready and want no more supplies." Lincoln told another general in November 1862 that "this expanding, and piling up of impedimenta, has been, so far, almost our ruin, and will be our final ruin if it is not abandoned....You would be better off.... for not having a thousand wagons, doing nothing but hauling forage to feed the animals that draw them, and taking at least two thousand men to care for the wagons and animals, who otherwise might be two thousand good soldiers." With Grant and Sherman, Lincoln finally had top generals who followed Ewell's dictum about the road to glory and who were willing to demand of their soldiers—and of themselves—the same exertions and sacrifices that Confederate commanders required of theirs. After the 1863 Vicksburg campaign that captured a key stronghold in Mississippi, Lincoln said of General Grant—whose rapid mobility and absence of a cumbersome supply line were a key to its success—that "Grant is my man and I am his the rest of the war!" Lincoln had opinions about battlefield tactics, but he rarely made suggestions to his field commanders for that level of operations. One exception, however, occurred in the second week of May 1862. Upset by McClellan's monthlong siege of Yorktown without any apparent result, Lincoln and Secretary of War Stanton and Secretary of the Treasury Salmon P. Chase sailed down to Hampton Roads on May 5 to discover that the Confederates had evacuated Yorktown before McClellan could open with his siege artillery. Norfolk remained in enemy hands, however, and the feared CSS Virginia (formerly the Merrimack) was still docked there. On May 7, Lincoln took direct operational control of a drive to capture Norfolk and to push a gunboat fleet up the James River. The president ordered Gen. John Wool, commander at Fort Monroe, to land troops on the south bank of Hampton Roads. Lincoln even personally carried out a reconnaissance to select the best landing place. On May 9, the Confederates evacuated Norfolk before Northern soldiers could get there. Two days later the Virginia's crew blew her up to prevent her capture. 
Chase rarely found opportunities to praise Lincoln, but on this occasion he wrote to his daughter: "So has ended a brilliant week's campaign of the President; for I think it quite certain that if he had not come down, Norfolk would still have been in possession of the enemy, and the 'Merrimac' as grim and defiant and as much a terror as ever....The whole coast is now virtually ours." Chase exaggerated, for the Confederates would have had to abandon Norfolk to avoid being cut off when Johnston's army retreated up the north side of the James River. But Chase's words can perhaps be applied to Lincoln's performance as commander in chief in the war as a whole. He enunciated a clear national policy, and through trial and error evolved national and military strategies to achieve it. The nation did not perish from the earth but experienced a new birth of freedom. Reprint from Our Lincoln: New Perspectives on Lincoln and His World, edited by Eric Foner. Copyright © 2008 by W.W. Norton & Co. Inc. "A. Lincoln, Commander in Chief" copyright © by James M. McPherson. With the permission of the publisher, W.W. Norton & Co. Inc.
d594ab43bd0a8eac65369a62d251dd98
https://www.smithsonianmag.com/history/lincolns-missing-bodyguard-12932069/
The Assassination of Abraham Lincoln
The Assassination of Abraham Lincoln When a celebrity-seeking couple crashed a White House state dinner last November, the issue of presidential security dominated the news. The Secret Service responded by putting three of its officers on administrative leave and scrambled to reassure the public that it takes the job of guarding the president very seriously. “We put forth the maximum effort all the time,” said Secret Service spokesman Edwin Donovan. That kind of dedication to safeguarding the president didn’t always exist. It wasn’t until 1902 that the Secret Service, created in 1865 to eradicate counterfeit currency, assumed official full-time responsibility for protecting the president. Before that, security for the president could be unbelievably lax. The most astounding example was the scant protection afforded Abraham Lincoln on the night he was assassinated. Only one man, an unreliable Washington cop named John Frederick Parker, was assigned to guard the president at Ford’s Theatre on April 14, 1865. Today it’s hard to believe that a single policeman was Lincoln’s only protection, but 145 years ago the situation wasn’t that unusual. Lincoln was cavalier about his personal safety, despite the frequent threats he received and a near-miss attempt on his life in August 1864, as he rode a horse unescorted. He’d often take in a play or go to church without guards, and he hated being encumbered by the military escort assigned to him. Sometimes he walked alone at night between the White House and the War Department, a distance of about a quarter of a mile. John Parker was an unlikely candidate to guard a president—or anyone for that matter. Born in Frederick County, Virginia, in 1830, Parker moved to Washington as a young man, originally earning his living as a carpenter. He became one of the capital’s first officers when the Metropolitan Police Force was organized in 1861. Parker’s record as a cop fell somewhere between pathetic and comical. He was hauled before the police board numerous times, facing a smorgasbord of charges that should have gotten him fired. But he received nothing more than an occasional reprimand. His infractions included conduct unbecoming an officer, using intemperate language and being drunk on duty. Charged with sleeping on a streetcar when he was supposed to be walking his beat, Parker declared that he’d heard ducks quacking on the tram and had climbed aboard to investigate. The charge was dismissed. When he was brought before the board for frequenting a whorehouse, Parker argued that the proprietress had sent for him. In November 1864, the Washington police force created the first permanent detail to protect the president, made up of four officers. Somehow, John Parker was named to the detail. Parker was the only one of the officers with a spotty record, so it was a tragic coincidence that he drew the assignment to guard the president that evening. As usual, Parker got off to a lousy start that fateful Friday. He was supposed to relieve Lincoln’s previous bodyguard at 4 p.m. but was three hours late. Lincoln’s party arrived at the theater at around 9 p.m. The play, Our American Cousin, had already started when the president entered his box directly above the right side of the stage. The actors paused while the orchestra struck up “Hail to the Chief.” Lincoln bowed to the applauding audience and took his seat. Parker was seated outside the president’s box, in the passageway beside the door. 
From where he sat, Parker couldn’t see the stage, so after Lincoln and his guests settled in, he moved to the first gallery to enjoy the play. Later, Parker committed an even greater folly: At intermission, he joined the footman and coachman of Lincoln’s carriage for drinks in the Star Saloon next door to Ford’s Theatre. John Wilkes Booth entered the theater around 10 p.m. Ironically, he’d also been in the Star Saloon, working up some liquid courage. When Booth crept up to the door to Lincoln’s box, Parker’s chair stood empty. Some of the audience may not have heard the fatal pistol shot, since Booth timed his attack to coincide with a scene in the play that always sparked loud laughter. No one knows for sure if Parker ever returned to Ford’s Theatre that night. When Booth struck, the vanishing policeman may have been sitting in his new seat with a nice view of the stage, or perhaps he had stayed put in the Star Saloon. Even if he had been at his post, it’s not certain he would have stopped Booth. “Booth was a well-known actor, a member of a famous theatrical family,” says Ford’s Theatre historical interpreter Eric Martin. “They were like Hollywood stars today. Booth might have been allowed in to pay his respects. Lincoln knew of him. He’d seen him act in The Marble Heart, here in Ford’s Theatre in 1863.” A fellow presidential bodyguard, William H. Crook, wouldn’t accept any excuses for Parker. He held him directly responsible for Lincoln’s death. “Had he done his duty, I believe President Lincoln would not have been murdered by Booth,” Crook wrote in his memoir. “Parker knew that he had failed in duty. He looked like a convicted criminal the next day.” Parker was charged with failing to protect the president, but the complaint was dismissed a month later. No local newspaper followed up on the issue of Parker’s culpability. Nor was Parker mentioned in the official report on Lincoln’s death. Why he was let off so easily is baffling. Perhaps, with the hot pursuit of Booth and his co-conspirators in the chaotic aftermath, he seemed like too small a fish. Or perhaps the public was unaware that a bodyguard had even been assigned to the president. Incredibly, Parker remained on the White House security detail after the assassination. At least once he was assigned to protect the grieving Mrs. Lincoln before she moved out of the presidential mansion and returned to Illinois. Mrs. Lincoln’s dressmaker, former slave Elizabeth Keckley, recalled the following exchange between the president’s widow and Parker: “So you are on guard tonight,” Mrs. Lincoln yelled, “on guard in the White House after helping to murder the President.” “I could never stoop to murder,” Parker stammered, “much less to the murder of so good and great a man as the President. I did wrong, I admit, and have bitterly repented. I did not believe any one would try to kill so good a man in such a public place, and the belief made me careless.” Mrs. Lincoln snapped that she would always consider him guilty and ordered him from the room. Some weeks before the assassination, she had written a letter on Parker’s behalf to exempt him from the draft, and some historians think she may have been related to him on her mother’s side. Parker remained on the Metropolitan Police Force for three more years, but his shiftlessness finally did him in. He was fired on August 13, 1868, for once again sleeping on duty. Parker drifted back into carpentry. He died in Washington in 1890, of pneumonia. 
Parker, his wife and their three children are buried together in the capital’s Glenwood Cemetery—on present-day Lincoln Road. Their graves are unmarked. No photographs have ever been found of John Parker. He remains a faceless character, his role in the great tragedy largely forgotten.
75db18198d622d1927c619257e25e780
https://www.smithsonianmag.com/history/lincolns-missing-bodyguard-12932069/?all&no-ist
The Assassination of Abraham Lincoln
The Assassination of Abraham Lincoln When a celebrity-seeking couple crashed a White House state dinner last November, the issue of presidential security dominated the news. The Secret Service responded by putting three of its officers on administrative leave and scrambled to reassure the public that it takes the job of guarding the president very seriously. “We put forth the maximum effort all the time,” said Secret Service spokesman Edwin Donovan. That kind of dedication to safeguarding the president didn’t always exist. It wasn’t until 1902 that the Secret Service, created in 1865 to eradicate counterfeit currency, assumed official full-time responsibility for protecting the president. Before that, security for the president could be unbelievably lax. The most astounding example was the scant protection afforded Abraham Lincoln on the night he was assassinated. Only one man, an unreliable Washington cop named John Frederick Parker, was assigned to guard the president at Ford’s Theatre on April 14, 1865. Today it’s hard to believe that a single policeman was Lincoln’s only protection, but 145 years ago the situation wasn’t that unusual. Lincoln was cavalier about his personal safety, despite the frequent threats he received and a near-miss attempt on his life in August 1864, as he rode a horse unescorted. He’d often take in a play or go to church without guards, and he hated being encumbered by the military escort assigned to him. Sometimes he walked alone at night between the White House and the War Department, a distance of about a quarter of a mile. John Parker was an unlikely candidate to guard a president—or anyone for that matter. Born in Frederick County, Virginia, in 1830, Parker moved to Washington as a young man, originally earning his living as a carpenter. He became one of the capital’s first officers when the Metropolitan Police Force was organized in 1861. Parker’s record as a cop fell somewhere between pathetic and comical. He was hauled before the police board numerous times, facing a smorgasbord of charges that should have gotten him fired. But he received nothing more than an occasional reprimand. His infractions included conduct unbecoming an officer, using intemperate language and being drunk on duty. Charged with sleeping on a streetcar when he was supposed to be walking his beat, Parker declared that he’d heard ducks quacking on the tram and had climbed aboard to investigate. The charge was dismissed. When he was brought before the board for frequenting a whorehouse, Parker argued that the proprietress had sent for him. In November 1864, the Washington police force created the first permanent detail to protect the president, made up of four officers. Somehow, John Parker was named to the detail. Parker was the only one of the officers with a spotty record, so it was a tragic coincidence that he drew the assignment to guard the president that evening. As usual, Parker got off to a lousy start that fateful Friday. He was supposed to relieve Lincoln’s previous bodyguard at 4 p.m. but was three hours late. Lincoln’s party arrived at the theater at around 9 p.m. The play, Our American Cousin, had already started when the president entered his box directly above the right side of the stage. The actors paused while the orchestra struck up “Hail to the Chief.” Lincoln bowed to the applauding audience and took his seat. Parker was seated outside the president’s box, in the passageway beside the door. 
From where he sat, Parker couldn’t see the stage, so after Lincoln and his guests settled in, he moved to the first gallery to enjoy the play. Later, Parker committed an even greater folly: At intermission, he joined the footman and coachman of Lincoln’s carriage for drinks in the Star Saloon next door to Ford’s Theatre. John Wilkes Booth entered the theater around 10 p.m. Ironically, he’d also been in the Star Saloon, working up some liquid courage. When Booth crept up to the door to Lincoln’s box, Parker’s chair stood empty. Some of the audience may not have heard the fatal pistol shot, since Booth timed his attack to coincide with a scene in the play that always sparked loud laughter. No one knows for sure if Parker ever returned to Ford’s Theatre that night. When Booth struck, the vanishing policeman may have been sitting in his new seat with a nice view of the stage, or perhaps he had stayed put in the Star Saloon. Even if he had been at his post, it’s not certain he would have stopped Booth. “Booth was a well-known actor, a member of a famous theatrical family,” says Ford’s Theatre historical interpreter Eric Martin. “They were like Hollywood stars today. Booth might have been allowed in to pay his respects. Lincoln knew of him. He’d seen him act in The Marble Heart, here in Ford’s Theatre in 1863.” A fellow presidential bodyguard, William H. Crook, wouldn’t accept any excuses for Parker. He held him directly responsible for Lincoln’s death. “Had he done his duty, I believe President Lincoln would not have been murdered by Booth,” Crook wrote in his memoir. “Parker knew that he had failed in duty. He looked like a convicted criminal the next day.” Parker was charged with failing to protect the president, but the complaint was dismissed a month later. No local newspaper followed up on the issue of Parker’s culpability. Nor was Parker mentioned in the official report on Lincoln’s death. Why he was let off so easily is baffling. Perhaps, with the hot pursuit of Booth and his co-conspirators in the chaotic aftermath, he seemed like too small a fish. Or perhaps the public was unaware that a bodyguard had even been assigned to the president. Incredibly, Parker remained on the White House security detail after the assassination. At least once he was assigned to protect the grieving Mrs. Lincoln before she moved out of the presidential mansion and returned to Illinois. Mrs. Lincoln’s dressmaker, former slave Elizabeth Keckley, recalled the following exchange between the president’s widow and Parker: “So you are on guard tonight,” Mrs. Lincoln yelled, “on guard in the White House after helping to murder the President.” “I could never stoop to murder,” Parker stammered, “much less to the murder of so good and great a man as the President. I did wrong, I admit, and have bitterly repented. I did not believe any one would try to kill so good a man in such a public place, and the belief made me careless.” Mrs. Lincoln snapped that she would always consider him guilty and ordered him from the room. Some weeks before the assassination, she had written a letter on Parker’s behalf to exempt him from the draft, and some historians think she may have been related to him on her mother’s side. Parker remained on the Metropolitan Police Force for three more years, but his shiftlessness finally did him in. He was fired on August 13, 1868, for once again sleeping on duty. Parker drifted back into carpentry. He died in Washington in 1890, of pneumonia. 
Parker, his wife and their three children are buried together in the capital’s Glenwood Cemetery—on present-day Lincoln Road. Their graves are unmarked. No photographs have ever been found of John Parker. He remains a faceless character, his role in the great tragedy largely forgotten.
48f41618f6165cbf002b8c7ae66e10f3
https://www.smithsonianmag.com/history/little-brother-of-war-147315888/
Little Brother of War
Little Brother of War "Sticks" such as those at left were the principal weapons used in a semi-sacred ball sport variously known as "They Bump Hips" or the "Little Brother of War" that American Indians believe was given to them by the Creator sometime in ages past. This pair is part of the American Indian exhibit in the Smithsonian's Arts and Industries Building. They were made only a century or so ago by Tuscarora Iroquois craftsmen using hickory and rawhide, the wood for their curved heads steamed for hours, then bent around a crook-shaped block. More than three feet long and weighing a couple of pounds, they would seem unwieldy to modern lacrosse players, who pass the ball around and whack at each other with 12-ounce sticks of plastic, titanium and nylon. But they are symbols of triumph for a Native American culture that has otherwise been largely ignored, if not eradicated, by the modern white world. Year by year lacrosse grows more popular in North America (there are some 2,000 high school and more than 500 college teams in the United States alone) as well as in other parts of the globe from Japan to Germany and the Czech Republic. (When the Czechs first took up the game in the late 1970s, they reportedly used as a guide George Catlin's famous 1834 painting of Choctaws playing the game.) Yet lacrosse remains a uniquely Indian sport, requiring fierce competitiveness, speed and endurance, remarkable dexterity and tolerance of pain. These days, of course, it is not lacrosse but professional football--with hockey as a close second--that people might reasonably describe as the "Little Brother of War." As played today, men's lacrosse involves ten players per team and lasts 60 minutes in a space roughly the size of a football field. It is still a game of hard knocks and bruises, played with fast-paced, passionate zeal by men and women. A remarkable witness to the demands and fascinations of the game is football's legendary running back Jim Brown. "Lacrosse is my favorite game," says Brown. "It takes tremendous endurance and skill." According to Rick Hill, Sr., a lacrosse stalwart and a professor of Native American studies at the State University of New York at Buffalo, little is known about the two Smithsonian sticks. But studies by Smithsonian researcher Thomas Vennum, Jr., author of American Indian Lacrosse: Little Brother of War (Smithsonian Press, 1994), suggest that in design lacrosse sticks are descendants of war clubs. The butt of one elaborately carved stick at the University of Pennsylvania, crafted a century and a half ago, represents a hand holding a ball. Alongside it on the shaft is a carving of a handshake. The clasped hands, Vennum says, are not necessarily friendly. They may be symbolic of a dance in which warriors clasped hands to "strengthen themselves . . . as protective medicine" for battle. Some experts regard the carved ball in the hand as some kind of medicine ball, but Vennum thinks it is also linked to the ball end of war clubs, often carved as if held in the mouth of a snake or the claws of a bird of prey. The idea was that when such clubs were used in battle, the snake or hawk symbolically loosed its grip, sending the ball flying through the air to strike an enemy's head and kill him. Sometimes the ball was carved as a human head that would fly off the club's handle and smack an enemy brave. One Iroquois legend tells of a flying head pursuing a whole family, bent on its annihilation. 
At the last second the ball is caught and thrown to its death in a vat of boiling bear grease. As the game was played by its original inventors, from 30 to 50 players might take part on vast ball fields without sidelines whose variable length was determined by both teams prior to the match. Games lasted for days at times, and in some tribes players and nonplayers alike bet ponies, fortunes in fur and beadwork, even wives and children, on the outcome. Early French and English settlers at first were both startled and horrified by the game. "Almost everything short of murder is allowable," one noted. "If one were not told beforehand that they were playing," another wrote, "one would certainly believe that they were fighting." Soon, however, they fell under the spell of the game, learning to watch (and place side bets) among themselves. So much so that lacrosse played a role during the period of Pontiac's Rebellion in which several Indian nations fought to reclaim lands from occupying British forces in what is now the Midwest. In 1763, during King George III's birthday celebration, Indians staged a game outside Fort Michilimackinac on Lake Michigan. While His Majesty's soldiers were caught up in the game's progress, warriors took the fort. The later history of the "Little Brother of War" was sometimes as contentious as the relationship between Indians and Euro-Americans. According to U.S. Lacrosse, the Baltimore-based national governing body of the sport, white Canadians were playing as early as 1839. By 1856 in Montreal the first non-Indian team had been organized, and in 1860 a Canadian dentist, Dr. William George Beers, wrote the first Europeanized rules. For a while lacrosse was promoted as the national game of Canada. Native American teams toured Europe playing exhibition games, including one for the benefit of Queen Victoria. Then, in 1880, the National Lacrosse Association of Canada banned Indians from championship play--officially on the grounds that the Indians were paid "professionals" not eligible for "amateur" sports. By that time the game was catching on in North American prep schools and colleges, with a scattering of Indian varsity players at such schools as Dartmouth and (later) Syracuse. Today in Indian communities all over North America at the first sign of spring youngsters sally forth carrying lacrosse sticks. Many Indian players still request to be buried with their sticks beside them. The tradition of carved wooden lacrosse sticks still flourishes as well. In the Tuscarora Nation, near Sanborn, New York, Tuskewe Krafts, a firm owned by John Wesley Patterson, Jr., turns out 10,000 sticks a year at prices running from $60 to $90. For many Indians in ancient days, and today as well, a lacrosse game was a ceremonial replay of the Creation story, and of the struggle between good and evil that followed it. The game could also be worldly practical--mock war used for diplomatic purposes or as a prudent step back from the threat of war. The story, retold by Vennum, of two lacrosse games played almost exactly 200 years ago between the Mohawk and the Seneca seems to offer a case in point. Both belonged to the powerful league of Six Nations, the Iroquois confederacy that also included the Onondaga, Cayuga, Oneida and Tuscarora. The year was 1794. After the French and Indian Wars and the American Revolution, whites were again threatening Indian holdings in what is now Ohio and western New York. 
Joseph Brant (Thayendanegea, in Mohawk), a powerful chief who had sided with the British during the Revolution, was negotiating with them for land in Canada, but the site offered was unacceptable. The Seneca agreed; if the Mohawk took it, they would be isolated from the rest of the Six Nations. When Seneca intervention resulted in a better site for the Mohawk, Brant set up a ceremonial lacrosse match in part, Vennum speculates, to celebrate the Seneca help. There was also bad blood between Brant and Red Jacket, an influential Seneca chief, going back to a time when Brant had called him a "cow killer," because it was said Red Jacket sent Seneca warriors off to battle while he stayed at home butchering their cows for himself. The match may have represented a fence-mending effort on Brant's part. If so, it apparently hit a snag. During the game, according to a report written at the time and cited in a biography of Brant published in 1838, a Mohawk lost his temper and "struck a sharp blow" to his opponent with his stick. All action stopped, the story goes; the Seneca team walked off the field. The Mohawk and the Seneca did not play each other again until 1797. But they kept on playing, and so did the other Iroquois nations. Lacrosse, in fact, was one of the things that helped hold the Six Nations together through the difficult years that followed. In 1990 the Iroquois Nationals, an all-Iroquois lacrosse team, traveled to Australia for the world championship under their own flag and carrying Iroquois passports. "We stood tall," says Rick Hill. "For a few moments the lacrosse-playing nations (England, Japan, Australia, the Czech Republic, the United States, Canada, Wales, Scotland, Sweden, Germany) saluted our national flag. It was quite a change after 200 years." By Adele Conover
0f1ec8c68344b6c55ae51d25893c99b3
https://www.smithsonianmag.com/history/long-history-disease-and-fear-other-180953310/
The Long History of Disease and the Fear of the “Other”
The Long History of Disease and the Fear of the “Other” “Health consists of having the same diseases as one’s neighbors,” the English writer Quentin Crisp once quipped. He was right. And what is true of the individual seems to be true of societies as a whole. “Parasite stress,” as scientists term it, has long been a factor in human relations, intensifying the fear and loathing of other peoples. For a while, it seemed that we had transcended all that. But, as Ebola reminds us, fundamental problems remain. No longer confined to remote rural locations, Ebola has become an urban disease and has spread uncontrollably in some West African nations, in the absence of effective healthcare. Ebola has also revived the Victorian image of Africa as a dark continent teeming with disease. And the dread of Ebola is no longer confined to the West. Indeed, it tends to be more apparent throughout Asia than among Americans and Europeans. In August, Korean Air terminated its only direct flight to Africa due to Ebola concerns, never mind that the destination was nowhere near the affected region of the continent, but thousands of miles to the east in Nairobi. North Korea has also recently suspended visits from all foreign visitors – regardless of origin. Anxiety about Ebola is more acute in Asia because epidemics, poverty, and famine are well within living memory. The roots of this mentality lie deep in our history. After humans mastered the rudiments of agriculture 12,000 years ago, they began to domesticate a greater variety of animals and came into contact with a wider range of infections. But this happened at different times in different places, and the resulting imbalance gave rise to the notion that some places were more dangerous than others. Thus, when the disease we call syphilis was first encountered in Europe in the late 1490s, it was labelled the Neapolitan or French disease, depending on where one happened to live. And, when the same disease arrived in India, with Portuguese sailors, it was called firangi roga, or the disease of the Franks (a term synonymous with “European”). The influenza that spread around the world from 1889 to 1890 was dubbed the “Russian Flu” (for no good reason) and the same was true of the “Spanish Flu” of 1918 to 1919. It is safe to assume they were not called these names in Russia or Spain. We are still inclined to think of epidemic disease as coming from somewhere else, brought to our doorstep by outsiders. Notions of infection first developed within a religious framework – pestilence came to be associated with vengeful deities who sought to punish transgressors or unbelievers. In the European plagues of 1347 to 1351 (the “Black Death”), Jews were made scapegoats and killed in substantial numbers. But the Black Death began a process whereby disease was gradually, albeit partially, secularized. With nearly half the population dead from plague, manpower was precious and many rulers attempted to preserve it, as well as to reduce the disorder that usually accompanied an epidemic. Disease became the trigger for new forms of intervention and social separation. Within states, it was the poor who came to be stigmatized as carriers of infection, on account of their supposedly unhygienic and ungodly habits. Countries began to use the accusation of disease to blacken the reputation of rival nations and damage their trade. Quarantines and embargoes became a form of war by other means and were manipulated cynically, often pandering to popular prejudice. 
The threat of disease was frequently used to stigmatize immigrants and contain marginalized peoples. The actual number of immigrants turned away at inspection stations such as Ellis Island was relatively small, but the emphasis placed on screening certain minorities helped shape public perceptions. During an epidemic of cholera in 1892, President Benjamin Harrison notoriously referred to immigrants as a “direct menace to public health,” singling out Russian Jews as a special danger. But as the global economy matured, constraints such as quarantines and embargoes became cumbersome. The panicky response to the re-emergence of plague in the 1890s, in cities such as Hong Kong, Bombay, Sydney and San Francisco, created enormous disruption. Trade came to a standstill and many businesses were destroyed. Great Britain and the United States proposed a different way of dealing with disease based less on stoppages and more on surveillance and selective intervention. Combined with sanitary reform in the world’s greatest ports, these measures were able to arrest epidemic diseases without disrupting commerce. The international sanitary agreements of the early 1900s marked a rare example of cooperation in a world otherwise fractured by imperial and national rivalries. The present effort to contain Ebola will probably succeed now that more personnel and resources have been sent to the afflicted countries. But our long-term security depends on the development of a more robust global health infrastructure capable of pre-emptive strikes against emerging infections. If there is one positive thing to note about the reaction to Ebola, it is that governments have responded, albeit belatedly, to growing public demand. A more inclusive, global identity appears to be emerging, with a substantially recalibrated understanding of our cross-border responsibilities in the realm of health. Whether this awareness and improvised crisis management translates into a long-lasting shift in how we tackle fast-spreading contagions remains an open question – a life-and-death one. Mark Harrison is Professor of the History of Medicine and Director of the Wellcome Unit for the History of Medicine, Oxford University. He is author of Contagion: How Commerce has Spread Disease (Yale University Press, 2013). He wrote this for Zocalo Public Square.
b1c9df4ffbc5b66e01e3ff26feb826f6
https://www.smithsonianmag.com/history/long-lasting-legacy-great-migration-180960118/
The Long-Lasting Legacy of the Great Migration
The Long-Lasting Legacy of the Great Migration In 1963, the American mathematician Edward Lorenz, taking a measure of the earth’s atmosphere in a laboratory that would seem far removed from the social upheavals of the time, set forth the theory that a single “flap of a sea gull’s wings” could redirect the path of a tornado on another continent, that it could, in fact, be “enough to alter the course of the weather forever,” and that, though the theory was then new and untested, “the most recent evidence would seem to favor the sea gulls.” At that moment in American history, the country had reached a turning point in a fight for racial justice that had been building for decades. This was the year of the killing of Medgar Evers in Mississippi, of the bombing of the 16th Street Baptist Church in Birmingham, of Gov. George Wallace blocking black students at the schoolhouse door of the University of Alabama, the year of the March on Washington, of Martin Luther King Jr.’s “I Have a Dream” speech and his “Letter From a Birmingham Jail.” By then, millions of African-Americans had already testified with their bodies to the repression they had endured in the Jim Crow South by defecting to the North and West in what came to be known as the Great Migration. They were fleeing a world where they were restricted to the most menial of jobs, underpaid if paid at all, and frequently barred from voting. Between 1880 and 1950, an African-American was lynched more than once a week for some perceived breach of the racial hierarchy. “They left as though they were fleeing some curse,” wrote the scholar Emmett J. Scott, an observer of the early years of the migration. “They were willing to make almost any sacrifice to obtain a railroad ticket and they left with the intention of staying.” The migration began, like the flap of a sea gull’s wings, as a rivulet of black families escaping Selma, Alabama, in the winter of 1916. Their quiet departure was scarcely noticed except for a single paragraph in the Chicago Defender, to whom they confided that “the treatment doesn’t warrant staying.” The rivulet would become rapids, which grew into a flood of six million people journeying out of the South over the course of six decades. They were seeking political asylum within the borders of their own country, not unlike refugees in other parts of the world fleeing famine, war and pestilence. Until that moment and from the time of their arrival on these shores, the vast majority of African-Americans had been confined to the South, at the bottom of a feudal social order, at the mercy of slaveholders and their descendants and often-violent vigilantes. The Great Migration was the first big step that the nation’s servant class ever took without asking. “Oftentimes, just to go away is one of the most aggressive things that another person can do,” wrote John Dollard, an anthropologist studying the racial caste system of the South in the 1930s, “and if the means of expressing discontent are limited, as in this case, it is one of the few ways in which pressure can be put on.” The refugees could not know what was in store for them and for their descendants at their destinations or what effect their exodus would have on the country. But by their actions, they would reshape the social and political geography of every city they fled to. When the migration began, 90 percent of all African-Americans were living in the South. 
By the time it was over, in the 1970s, 47 percent of all African-Americans were living in the North and West. A rural people had become urban, and a Southern people had spread themselves all over the nation. Merely by leaving, African-Americans would get to participate in democracy and, by their presence, force the North to pay attention to the injustices in the South and the increasingly organized fight against those injustices. By leaving, they would change the course of their lives and those of their children. They would become Richard Wright the novelist instead of Richard Wright the sharecropper. They would become John Coltrane, jazz musician instead of tailor; Bill Russell, NBA pioneer instead of paper mill worker; Zora Neale Hurston, beloved folklorist instead of maidservant. The children of the Great Migration would reshape professions that, had their families not left, might never have been open to them, from sports and music to literature and art: Miles Davis, Ralph Ellison, Toni Morrison, August Wilson, Jacob Lawrence, Diana Ross, Tupac Shakur, Prince, Michael Jackson, Shonda Rhimes, Venus and Serena Williams and countless others. The people who migrated would become the forebears of most African-Americans born in the North and West. The Great Migration would expose the racial divisions and disparities that in many ways continue to plague the nation and dominate headlines today, from police killings of unarmed African-Americans to mass incarceration to widely documented biases in employment, housing, health care and education. Indeed, two of the most tragically recognizable descendants of the Great Migration are Emmett Till, a 14-year-old Chicago boy killed in Mississippi in 1955, and Tamir Rice, a 12-year-old Cleveland boy shot to death by police in 2014 in the city to which his ancestors had fled. Their fates are a reminder that the perils the people sought to escape were not confined to the South, nor to the past. The history of African-Americans is often distilled into two epochs: the 246 years of enslavement ending after the close of the Civil War, and the dramatic era of protest during the civil rights movement. Yet the Civil War-to-civil rights axis tempts us to leap past a century of resistance against subjugation, and to miss the human story of ordinary people, their hopes lifted by Emancipation, dashed at the end of Reconstruction, crushed further by Jim Crow, only to be finally, at long last, revived when they found the courage within themselves to break free. ********** A little boy boarded a northbound train with his grandmother and extended family, along with their upright piano and the rest of their worldly possessions, stuffed inside wooden crates, to begin their journey out of Mississippi. It was 1935. They were packed into the Jim Crow car, which, by custom, was at the front of the train, the first to absorb the impact in the event of a collision. They would not be permitted into the dining car, so they carried fried chicken and boiled eggs to tide them over for the journey. The little boy was 4 years old and anxious. He’d overheard the grown-ups talking about leaving their farm in Arkabutla, to start over up north. He heard them say they might leave him with his father’s people, whom he didn’t know. In the end they took him along. The near abandonment haunted him. 
He missed his mother, who would not be joining them on this journey; she was away trying to make a stable life for herself after the breakup with his father. He did not know when he would see her again. His grandfather had preceded them north. He was a hardworking, serious man who kept the indignities he suffered under Jim Crow to himself. In Mississippi, he had not dared stand up to some white children who broke the family’s wagon. He told the little boy that as black people, they had no say in that world. “There were things they could do that we couldn’t,” the boy would say of the white children when he was a grown man with gray hair and a son of his own. The grandfather was so determined to get his family out of the South that he bought a plot of land sight unseen in a place called Michigan. On the trip north, the little boy and his cousins and uncles and aunts (who were children themselves) did not quite know what Michigan was, so they made a ditty out of it and sang it as they waited for the train. “Meatskin! Meatskin! We’re going to Meatskin!” They landed on freer soil, but between the fears of abandonment and the trauma of being uprooted from his mother, the little boy arrived with a stutter. He began to speak less and less. At Sunday school, the children bellowed with laughter whenever he tried. So instead, he talked to the hogs and cows and chickens on the farm, who, he said years later, “don’t care how you sound.” The little boy went mute for eight years. He wrote down the answers to questions he was asked, fearing even to introduce himself to strangers, until a high school English teacher coaxed him out of his silence by having him read poetry aloud to the class. That boy was James Earl Jones. He would go on to the University of Michigan, where he abandoned pre-med for theater. Later he would play King Lear in Central Park and Othello on Broadway, win Tony Awards for his performances in Fences and in The Great White Hope and star in films like Dr. Strangelove, Roots, Field of Dreams and Coming to America. The voice that fell silent for so long would become among the most iconic of our time—the voice of Darth Vader in Star Wars, of Mufasa in The Lion King, the voice of CNN. Jones lost his voice, and found it, because of the Great Migration. “It was responsible for all that I am grateful for in my life,” he told me in a recent interview in New York. “We were reaching for our gold mines, our freedom.” ********** The desire to be free is, of course, human and universal. In America, enslaved people had tried to escape through the Underground Railroad. Later, once freed on paper, thousands more, known as Exodusters, fled the violent white backlash following Reconstruction in a short-lived migration to Kansas in 1879. But concentrated in the South as they were, held captive by the virtual slavery of sharecropping and debt peonage and isolated from the rest of the country in the era before airlines and interstates, many African-Americans had no ready means of making a go of it in what were then faraway alien lands. By the opening of the 20th century, the optimism of the Reconstruction era had long turned into the terror of Jim Crow. In 1902, one black woman in Alabama seemed to speak for the agitated hearts that would ultimately propel the coming migration: “In our homes, in our churches, wherever two or three are gathered together,” she said, “there is a discussion of what is best to do. Must we remain in the South or go elsewhere? 
Where can we go to feel that security which other people feel? Is it best to go in great numbers or only in several families? These and many other things are discussed over and over.” The door of escape opened during World War I, when slowing immigration from Europe created a labor shortage in the North. To fill the assembly lines, companies began recruiting black Southerners to work the steel mills, railroads and factories. Resistance in the South to the loss of its cheap black labor meant that recruiters often had to act in secret or face fines and imprisonment. In Macon, Georgia, for example, a recruiter’s license required a $25,000 fee plus the unlikely recommendations of 25 local businessmen, ten ministers and ten manufacturers. But word soon spread among black Southerners that the North had opened up, and people began devising ways to get out on their own. Southern authorities then tried to keep African-Americans from leaving by arresting them at the railroad platforms on grounds of “vagrancy” or tearing up their tickets in scenes that presaged tragically thwarted escapes from behind the Iron Curtain during the Cold War. And still they left. On one of the early trains out of the South was a sharecropper named Mallie Robinson, whose husband had left her to care for their young family under the rule of a harsh plantation owner in Cairo, Georgia. In 1920, she gathered up her five children, including a baby still in diapers, and, with her sister and brother-in-law and their children and three friends, boarded a Jim Crow train, and another, and another, and didn’t get off until they reached California. They settled in Pasadena. When the family moved into an all-white neighborhood, a cross was burned on their front lawn. But here Mallie’s children would go to integrated schools for the full year instead of segregated classrooms in between laborious hours chopping and picking cotton. The youngest, the one she had carried in her arms on the train out of Georgia, was named Jackie, who would go on to earn four letters in athletics in a single year at UCLA. Later, in 1947, he became the first African-American to play Major League Baseball. Had Mallie not persevered in the face of hostility, raising a family of six alone in the new world she had traveled to, we might not have ever known his name. “My mother never lost her composure,” Jackie Robinson once recalled. “As I grew older, I often thought about the courage it took for my mother to break away from the South.” Mallie was extraordinary in another way. Most people, when they left the South, followed three main tributaries: the first was up the East Coast from Florida, Georgia, the Carolinas and Virginia to Washington, D.C., Baltimore, Philadelphia, New York and Boston; the second, up the country’s central spine, from Alabama, Mississippi, Tennessee and Arkansas to St. Louis, Chicago, Cleveland, Detroit and the entire Midwest; the third, from Louisiana and Texas to California and the Western states. But Mallie took one of the farthest routes in the continental U.S. to get to freedom, a westward journey of more than 2,200 miles. The trains that spirited the people away, and set the course for those who would come by bus or car or foot, acquired names and legends of their own. 
Perhaps the most celebrated were those that rumbled along the Illinois Central Railroad, for which Abraham Lincoln had worked as a lawyer before his election to the White House, and from which Pullman porters distributed copies of the Chicago Defender in secret to black Southerners hungry for information about the North. The Illinois Central was the main route for those fleeing Mississippi for Chicago, people like Muddy Waters, the blues legend who made the journey in 1943 and whose music helped define the genre and pave the way for rock ’n’ roll, and Richard Wright, a sharecropper’s son from Natchez, Mississippi, who got on a train in 1927 at the age of 19 to feel what he called “the warmth of other suns.” In Chicago, Wright worked washing dishes and sweeping streets before landing a job at the post office and pursuing his dream as a writer. He began to visit the library: a right and pleasure that would have been unthinkable in his home state of Mississippi. In 1940, having made it to New York, he published Native Son to national acclaim, and, through this and other works, became a kind of poet laureate of the Great Migration. He seemed never to have forgotten the heartbreak of leaving his homeland and the courage he mustered to step into the unknown. “We look up at the high Southern sky,” Wright wrote in 12 Million Black Voices. “We scan the kind, black faces we have looked upon since we first saw the light of day, and, though pain is in our hearts, we are leaving.” Zora Neale Hurston arrived in the North along the East Coast stream from Florida, although, as was her way, she broke convention in how she got there. She had grown up as the willful younger daughter of an exacting preacher and his long-suffering wife in the all-black town of Eatonville. After her mother died, when she was 13, Hurston bounced between siblings and neighbors until she was hired as a maid with a traveling theater troupe that got her north, dropping her off in Baltimore in 1917. From there, she made her way to Howard University in Washington, where she got her first story published in the literary magazine Stylus while working odd jobs as a waitress, maid and manicurist. She continued on to New York in 1925 with $1.50 to her name. She would become the first black student known to graduate from Barnard College. There, she majored in English and studied anthropology, but was barred from living in the dormitories. She never complained. In her landmark 1928 essay “How It Feels to Be Colored Me,” she mocked the absurdity: “Sometimes, I feel discriminated against, but it does not make me angry,” she wrote. “It merely astonishes me. How can any deny themselves the pleasure of my company? It’s beyond me.” She arrived in New York when the Harlem Renaissance, an artistic and cultural flowering in the early years of the Great Migration, was in full bloom. The influx to the New York region would extend well beyond the Harlem Renaissance and draw the parents or grandparents of, among so many others, Denzel Washington (Virginia and Georgia), Ella Fitzgerald (Newport News, Virginia), the artist Romare Bearden (Charlotte, North Carolina), Whitney Houston (Blakeley, Georgia), the rapper Tupac Shakur (Lumberton, North Carolina), Sarah Vaughan (Virginia) and Althea Gibson (Clarendon County, South Carolina), the tennis champion who, in 1957, became the first black player to win at Wimbledon. 
From Aiken, South Carolina, and Bladenboro, North Carolina, the migration drew the parents of Diahann Carroll, who would become the first black woman to win a Tony Award for best actress and, in 1968, to star in her own television show in a role other than a domestic. It was in New York that the mother of Jacob Lawrence settled after a winding journey from Virginia to Atlantic City to Philadelphia and then on to Harlem. Once there, to keep teenage Jacob safe from the streets, she enrolled her eldest son in an after-school arts program that would set the course of his life. Lawrence would go on to create “The Migration Series”—60 painted panels, brightly colored like the throw rugs his mother kept in their tenement apartment. The paintings would become not only the best-known images of the Great Migration but among the most recognizable images of African-Americans in the 20th century. ********** Yet throughout the migration, wherever black Southerners went, the hostility and hierarchies that fed the Southern caste system seemed to carry over into the receiving stations in the New World, as the cities of the North and West erected barriers to black mobility. There were “sundown towns” throughout the country that banned African-Americans after dark. The constitution of Oregon explicitly prohibited black people from entering the state until 1926; whites-only signs could still be seen in store windows into the 1950s. Even in the places where they were permitted, blacks were relegated to the lowest-paying, most dangerous jobs, barred from many unions and, at some companies, hired only as strike breakers, which served to further divide black workers from white. They were confined to the most dilapidated housing in the least desirable sections of the cities to which they fled. In densely populated destinations like Pittsburgh and Harlem, housing was so scarce that some black workers had to share the same single bed in shifts. When African-Americans sought to move their families to more favorable conditions, they faced a hardening structure of policies and customs designed to maintain racial exclusion. Restrictive covenants, introduced as a response to the influx of black people during the Great Migration, were clauses written into deeds that prohibited African-Americans from buying, leasing or living in properties in white neighborhoods, with the exception, often explicitly spelled out, of servants. By the 1920s, the widespread use of restrictive covenants kept as much as 85 percent of Chicago off-limits to African-Americans. At the same time, redlining—the federal housing policy of refusing to approve or guarantee mortgages in areas where black people lived—served to deny them access to mortgages in their own neighborhoods. These policies became the pillars of a residential caste system in the North that calcified segregation and wealth inequality over generations, denying African-Americans the chance accorded other Americans to improve their lot. In the 1930s, a black couple in Chicago named Carl and Nannie Hansberry decided to fight these restrictions to make a better life for themselves and their four young children. They had migrated north during World War I, Carl from Mississippi and Nannie from Tennessee. He was a real estate broker, she was a schoolteacher, and they had managed to save up enough to buy a home. They found a brick three-flat with bay windows in the all-white neighborhood of Woodlawn. 
Although other black families moving into white neighborhoods had endured firebombings and mob violence, Carl wanted more space for his family and bought the house in secret with the help of progressive white real estate agents he knew. He moved the family late in the spring of 1937. The couple’s youngest daughter, Lorraine, was 7 years old when they first moved, and she later described the vitriol and violence her family met in what she called a “hellishly hostile ‘white neighborhood’ in which literally howling mobs surrounded our house.” At one point a mob descended on the home to throw bricks and broken concrete, narrowly missing her head. But not content simply to terrorize the Hansberrys, neighbors then filed a lawsuit, forcing the family to move out, backed by state courts and restrictive covenants. The Hansberrys took the case to the Supreme Court to challenge the restrictive covenants and to return to the house they bought. The case culminated in a 1940 Supreme Court decision that was one of a series of cases that together helped strike a blow against segregation. But the hostility continued. Lorraine Hansberry later recalled being “spat at, cursed and pummeled in the daily trek to and from school. And I also remember my desperate and courageous mother, patrolling our household all night with a loaded German Luger, doggedly guarding her four children, while my father fought the respectable part of the battle in the Washington court.” In 1959, Hansberry’s play A Raisin in the Sun, about a black family on Chicago’s South Side living in dilapidated housing with few better options and at odds over what to do after the death of the patriarch, became the first play written by an African-American woman to be performed on Broadway. The fight by those who migrated and those who marched eventually led to the Fair Housing Act of 1968, which made such discriminatory practices illegal. Carl Hansberry did not live to see it. He died in 1946 at age 50 while in Mexico City, where, disillusioned with the slow speed of progress in America, he was working on plans to move his family to Mexico. ********** The Great Migration laid bare tensions in the North and West that were not as far removed from the South as the people who migrated might have hoped. Martin Luther King Jr., who went north to study in Boston, where he met his wife, Coretta Scott, experienced the depth of Northern resistance to black progress when he was campaigning for fair housing in Chicago decades after the Hansberrys’ fight. He was leading a march in Marquette Park, in 1966, amid fuming crowds. One placard said: “King would look good with a knife in his back.” A protester hurled a stone that hit him in the head. Shaken, he fell to one knee. “I have seen many demonstrations in the South,” he told reporters. “But I have never seen anything so hostile and so hateful as I’ve seen here today.” Out of such turmoil arose a political consciousness in a people who had been excluded from civic life for most of their history. The disaffected children of the Great Migration grew more outspoken about the worsening conditions in their places of refuge. Among them was Malcolm X, born Malcolm Little in 1925 in Omaha, Nebraska, to a lay minister who had journeyed north from Georgia, and a mother born in Grenada. Malcolm was 6 years old when his father, who was under continuous attack by white supremacists for his role fighting for civil rights in the North, died a violent, mysterious death that plunged the family into poverty and dislocation. 
Despite the upheaval, Malcolm was accomplished in his predominantly white school, but when he shared his dream of becoming a lawyer, a teacher told him that the law was “no realistic goal for a n-----.” He dropped out soon afterward. He would go on to become known as Detroit Red, Malcolm X and el-Hajj Malik el-Shabazz, a journey from militancy to humanitarianism, a voice of the dispossessed and a counterweight to Martin Luther King Jr. during the civil rights movement. At around the same time, a radical movement was brewing on the West Coast. Huey Newton was the impatient son of a preacher and itinerant laborer who left Louisiana with his family for Oakland, after his father was almost lynched for talking back to a white overseer. Huey was a toddler when they arrived in California. There, he struggled in schools ill-equipped to handle the influx of newcomers from the South. He was pulled to the streets and into juvenile crime. It was only after high school that he truly learned to read, but he would go on to earn a PhD. In college he read Malcolm X and met classmate Bobby Seale, with whom, in 1966, he founded the Black Panther Party, built on the ideas of political action first laid out by Stokely Carmichael. The Panthers espoused self-determination, quality housing, health care and full employment for African-Americans. They ran schools and fed the poor. But they would become known for their steadfast and militant belief in the right of African-Americans to defend themselves when under attack, as had been their lot for generations in the Jim Crow South and was increasingly the case in the North and West. Perhaps no participant in the Great Migration had as deep an impact on activism and social justice, with as little recognition for her role, as Ella Baker. She was born in 1903 in Norfolk, Virginia, to devout and ambitious parents and grew up in North Carolina. After graduating from Shaw University, in Raleigh, she left for New York in 1927. There she worked as a waitress, factory worker and editorial assistant before becoming active in the NAACP, where she eventually rose to national director. Baker became the quiet shepherd of the civil rights movement, working alongside Martin Luther King Jr., Thurgood Marshall and W.E.B. DuBois. She mentored the likes of Stokely Carmichael and Rosa Parks and helped to create the Student Nonviolent Coordinating Committee—the network of college students who risked their lives to integrate buses and register blacks to vote in the most dangerous parts of the South. She helped guide almost every major event in the civil rights era, from the Montgomery bus boycott to the march in Selma to the Freedom Rides and the student sit-ins of the 1960s. Baker was among those who suggested to King, then still in his 20s, that he take the movement beyond Alabama after the success of the bus boycott and press for racial equality throughout the South. She had a keen understanding that a movement would need Southern origins in order for participants not to be dismissed as “Northern agitators.” King was at first reluctant to push his followers in the aftermath of the taxing 381-day boycott, but she believed that momentum was crucial. The modern civil rights movement had begun. Baker devoted her life to working at the ground level in the South to organize the nonviolent demonstrations that helped change the region she had left but not forsaken. 
She directed students and sharecroppers, ministers and intellectuals, but never lost a fervent belief in the power of ordinary people to change their destiny. “Give light,” she once said, “and people will find the way.” ********** Over time, as the people of the Great Migration embedded themselves in their cities, they aspired to leading roles in civic life. It could not have been imagined in the migration’s early decades that the first black mayors of most major cities in the North and West would not be longtime Northerners, as might have been expected, but rather children of the Great Migration, some having worked the Southern fields themselves. The man who would become the first black mayor of Los Angeles, Tom Bradley, was born on a cotton plantation in Calvert, Texas, to sharecroppers Crenner and Lee Thomas Bradley. The family migrated to Los Angeles when he was 7 years old. Once there his father abandoned the family, and his mother supported him and his four siblings working as a maid. Bradley grew up on Central Avenue among the growing colony of black arrivals from the South. He became a track star at UCLA and later joined the Los Angeles police force, rising to lieutenant, the highest rank allowed African-Americans in the 1950s. Seeing limits on his advancement, he went to law school at night, won a seat on the city council, and was elected mayor in 1973, serving five consecutive terms. His name would become a part of the political lexicon after he ran for governor of California in 1982. Polls had overestimated support for him due to what was believed to be the reluctance of white voters to be truthful with pollsters about their intention to vote for his white opponent, George Deukmejian. To this day, in an election involving a non-white candidate, the discrepancy between polling numbers and final outcomes due to the misleading poll responses of white voters is known as the “Bradley Effect.” In the 1982 election that Bradley had been favored to win, he lost by a single percentage point. Still, he would describe Los Angeles, the place that drew his family out of Texas, as “the city of hope and opportunity.” He said, “I am a living example of that.” ********** The story of African-Americans on this soil cannot be told without the Great Migration. For many of them, the 20th century was largely an era of migrating and marching until freedom, by law and in their hearts, was won. Its mission over, the migration ended in the 1970s, when the South had sufficiently changed so that African-Americans were no longer under pressure to leave and were free to live anywhere they chose. From that time, to the current day, a new narrative took hold in popular thought that has seized primarily on geographical census data, gathered every ten years, showing that since 1975 the South has witnessed a net increase of African-Americans, many drawn (like other Americans) to job opportunities and a lower cost of living, but also to the call of their ancestral homeland, enacting what has come to be called a “reverse migration.” The phrase and phenomenon have captured the attention of demographers and journalists alike who revisit the trend after each new census. One report went so far as to describe it as “an evacuation” from the Northern cities by African-Americans back to the place their forebears had fled. But the demographics are more complex than the narrative often portrayed. 
While hundreds of thousands of African-Americans have left Northern cities, they have not made a trail to the farms and hamlets where their ancestors may have picked cotton but to the biggest cities of the South—Atlanta, Houston, Dallas—which are now more cosmopolitan and thus more like their Northern counterparts. Many others have not headed South at all but have fanned out to suburbs or smaller cities in the North and West, places like Las Vegas, Columbus, Ohio, or even Ferguson, Missouri. Indeed, in the 40 years since the migration ended, the proportion of the South that is African-American has remained unchanged at about 20 percent—far from the seismic impact of the Great Migration. And so “reverse migration” seems not only an overstatement but misleading, as if relocating to an employer’s Houston office were equivalent to running for one’s life on the Illinois Central. Richard Wright relocated several times in his quest for other suns, fleeing Mississippi for Memphis and Memphis for Chicago and Chicago for New York, where, living in Greenwich Village, barbers refused to serve him and some restaurants refused to seat him. In 1946, near the height of the Great Migration, he came to the disheartening recognition that, wherever he went, he faced hostility. So he went to France. Similarly, African-Americans today must navigate the social fault lines exposed by the Great Migration and the country’s reactions to it: white flight, police brutality, systemic ills flowing from government policy restricting fair access to safe housing and good schools. In recent years, the North, which never had to confront its own injustices, has moved toward a crisis that seems to have reached a boiling point in our current day: a catalog of videotaped assaults and killings of unarmed black people, from Rodney King in Los Angeles in 1991, Eric Garner in New York in 2014, Philando Castile outside St. Paul, Minnesota, this summer, and beyond. Thus the eternal question is: Where can African-Americans go? It is the same question their ancestors asked and answered, only to discover upon arriving that the racial caste system was not Southern but American. And so it was in these places of refuge that Black Lives Matter arose, a largely Northern- and Western-born protest movement against persistent racial discrimination in many forms. It is organic and leaderless like the Great Migration itself, bearing witness to attacks on African-Americans in the unfinished quest for equality. The natural next step in this journey has turned out to be not simply moving to another state or geographic region but moving fully into the mainstream of American life, to be seen in one’s full humanity, to be able to breathe free wherever one lives in America. From this perspective, the Great Migration has no contemporary geographic equivalent because it was not solely about geography. It was about agency for a people who had been denied it, who had geography as the only tool at their disposal. It was an expression of faith, despite the terrors they had survived, that the country whose wealth had been created by their ancestors’ unpaid labor might do right by them. We can no more reverse the Great Migration than unsee a painting by Jacob Lawrence, unhear Prince or Coltrane, erase The Piano Lesson, remove Mae Jemison from her spacesuit in science textbooks, delete Beloved. 
In a short span of time—in some cases, over the course of a single generation—the people of the Great Migration proved the worldview of the enslavers a lie, that the people who were forced into the field and whipped for learning to read could do far more than pick cotton, scrub floors. Perhaps, deep down, the enslavers always knew that. Perhaps that is one reason they worked so hard at such a brutal system of subjugation. The Great Migration was thus a Declaration of Independence. It moved those who had long been invisible not just out of the South but into the light. And a tornado triggered by the wings of a sea gull can never be unwound. Isabel Wilkerson is a former Chicago bureau chief for The New York Times and a Pulitzer Prize winner. She is the author of the best-selling The Warmth of Other Suns: The Epic Story of America’s Great Migration.