9f31ac271d9fb456d35376d2ff4bcfac
https://www.smithsonianmag.com/history/1948-democratic-convention-878284/
1948 Democratic Convention
1948 Democratic Convention The Democrats came to Philadelphia on July 12, seventeen days after the Republicans, meeting in the same city, had nominated a dream ticket of two hugely popular governors: Thomas E. Dewey of New York for president and Earl Warren of California for vice president. The Democrats' man, President Harry S. Truman, had labored for more than three years in the enormous shadow of Franklin D. Roosevelt. In their hearts, all but the most optimistic delegates thought, as Clare Boothe Luce had told the Republican gathering, that the president was a "gone goose." Truman, a failed haberdasher turned politician, had the appearance of a meek bookkeeper. In fact, he was feisty and prone to occasional angry outbursts. His upper-South twang did not resonate with much of the country. His many detractors wrote him off as a "little man" who had been unable to deal with difficult post-World War II issues—inflation and consumer shortages, civil rights for African-Americans and a developing cold war with the Soviet Union. In the off-year elections of 1946, Republicans had gained firm control of both houses of Congress for the first time since 1928. Few Democrats believed Truman could lead them to victory in the presidential race. A large group of cold war liberals—many of them organized in the new Americans for Democratic Action (ADA)—joined with other Democratic leaders in an attempt to draft America's greatest living hero, Gen. Dwight D. Eisenhower, as their candidate. The general seemed momentarily persuadable, then quickly backed away. It was no coincidence that both parties met in Philadelphia. The city was at the center point of the Boston to Richmond coaxial cable, then the main carrier of live television in the United States. By 1948, as many as ten million people from Boston to Richmond could watch the tumultuous process by which the major parties selected their candidates. They could also see star journalists whom they had known only as voices, most notably the CBS team of Edward R. Murrow, Quincy Howe and Douglas Edwards. The parties met amid miles of media cable and wiring in Convention Hall, an imposing Art Deco arena decorated with exterior friezes that celebrated American values and the history of humankind. The structure could accommodate 12,000 people. Packed to the rafters on a steamy July day, heated by blazing television lights and possessing no effective cooling system, the great hall was like an enormous sauna. The Democrats' keynote speaker was Senator Alben Barkley of Kentucky. A presence on Capitol Hill since 1912 and the Democratic leader in the upper house for more than a decade, Barkley was much liked throughout the party and a master orator in the grand tradition. His speech scourged the Republican-controlled Congress, quoted patron saints of the Democratic Party from Jefferson to FDR, expropriated Lincoln along the way and cited biblical text from the Book of Revelation. The delegates cheered themselves hoarse, and an ensuing demonstration waved "Barkley for Vice President" placards. Truman, watching the proceedings on TV in Washington, was not amused. He considered "old man Barkley" (at age 70, six and a half years his senior) to be little more than a hail fellow with whom one sipped bourbon and swapped tall tales. The president wanted a young, dynamic and aggressively liberal running mate. He already had offered the slot to Supreme Court Justice William O. Douglas, who declined.
With no backup, Truman turned to Barkley: "Why didn't you tell me you wanted to run, Alben? That's all you had to do." Barkley accepted. By then, the attention of the delegates had shifted to a platform fight that marked the full emergence of the modern Democratic Party. African-Americans were an important Democratic constituency, but so were white Southerners. Previous party platforms had never gotten beyond bland generalizations about equal rights for all. Truman was prepared to accept another such document, but liberals, led by the ADA, wanted to commit the party to four specific points in the president's own civil rights program: abolition of state poll taxes in federal elections, an anti-lynching law, a permanent fair employment practices committee and desegregation of the armed forces. Hubert Humphrey, mayor of Minneapolis and a candidate for Senate, delivered the liberal argument in an intensely emotional speech: "The time is now arrived in America for the Democratic Party to get out of the shadow of states' rights and walk forthrightly into the bright sunshine of human rights." On July 14, the last day of the convention, the liberals won a close vote. The entire Mississippi delegation and half the Alabama contingent walked out of the convention. The rest of the South would back Senator Richard B. Russell of Georgia as a protest candidate against Truman for the presidential nomination. Nearly two weeks after the convention, the president issued executive orders mandating equal opportunity in the armed forces and in the federal civil service. Outraged segregationists moved ahead with the formation of a States' Rights ("Dixiecrat") Party with Gov. Strom Thurmond of South Carolina as its presidential candidate. The States' Rights Party avoided outright race baiting, but everyone understood that it was motivated by more than abstract constitutional principles. Truman was slated to deliver his acceptance speech at 10 p.m. on July 14 but arrived to find the gathering hopelessly behind schedule. As he waited, nominating speeches and roll calls droned on and on. Finally, at 2 a.m. he stepped up to the podium. Most of America was sound asleep. He wore a white linen suit and dark tie, ideal for the stifling hall and the rudimentary capabilities of 1948 television. His speech sounded almost spit into the ether at the opposition. "Senator Barkley and I will win this election and make these Republicans like it—don't you forget that!" He announced he would call Congress back into session on July 26—Turnip Day to Missouri farmers—and dare it to pass all the liberal-sounding legislation endorsed in the Republican platform. "The battle lines of 1948 are the same as they were in 1932," he declared, "when the nation lay prostrate and helpless as a result of Republican misrule and inaction." New York Times radio and TV critic Jack Gould judged it perhaps the best performance of Truman's presidency: "He was relaxed and supremely confident, swaying on the balls of his feet with almost a methodical rhythm." The delegates loved it. Truman's tireless campaigning that fall culminated in a feel-good victory of a little guy over an organization man. It especially seemed to revitalize the liberals, for whom the platform fight in Philadelphia became a legendary turning point. "We tied civil rights to the masthead of the Democratic Party forever," remarked ADA activist Joseph Rauh 40 years later. In truth, the ramifications of that victory would require two decades to play out. 
In the meantime, Thurmond, winning four states and 39 electoral votes, had fired a telling shot across the Democrats' bow. Dixiecrat insurgents in Congress returned to their seats in 1949 with no penalty from their Democratic colleagues. Party leaders, North and South, understood the danger of a spreading revolt. Truman would not backtrack on his commitment to civil rights, but neither would Congress give him the civil rights legislation he requested. His successors as party leader would show little disposition to push civil rights until the mass protests led by Martin Luther King Jr. forced the hands of John F. Kennedy and Lyndon B. Johnson. Only then would the ultimate threat of the Dixiecrats be realized—the movement of the white South into the Republican Party. Alonzo L. Hamby, a professor of history at Ohio University, wrote Man of the People: A Life of Harry S. Truman.
1958ac086be8b7252218a4909c656e20
https://www.smithsonianmag.com/history/1950s-tv-show-set-stage-modern-distance-learning-180976734/
The 1950s TV Show That Set the Stage for Today’s Distance Learning
The 1950s TV Show That Set the Stage for Today’s Distance Learning In May 1962, the New York Times profiled Cora Gay Carr, a 37-year-old housewife and mother of two who was set to receive her Bachelor of Arts in English from New York University (NYU). An impressive accomplishment in and of itself, Carr’s graduation made headlines because she’d earned 54 of the 128 required credits by watching a television show: “Sunrise Semester.” Launched in 1957, when NYU partnered with WCBS-TV to produce the series, “Sunrise Semester” broadcast lectures from NYU faculty to the general public. (Viewers who wanted to actually receive college credit had to pay a fee and complete additional coursework.) At the peak of its 25-year run, the show attracted an audience of around two million viewers, in addition to garnering multiple local Emmy Awards. Carr initially registered for “Sunrise Semester” purely for the intellectual stimulation it offered. But tuning in regularly inspired her to return to school, and she became what’s known today as a “hybrid” student, taking courses both from home and in person on NYU’s campus. “The whole concept of doing this sort of thing on TV is wonderful for someone like myself, who never would have thought of going back to college,” Carr told the Times. Sixty-four years after “Sunrise Semester” premiered, distance learning has become the new normal, with schools of all levels attempting to mitigate the spread of Covid-19 by transitioning to online platforms like Zoom and Google Classroom. “These are very, very parallel moments,” says Rosanna Flouty, an NYU scholar of museum studies who wrote her dissertation on the television series’ place in distance learning. “Sunrise Semester” was so named because of its early time slot: 6:30 to 7:00 a.m. One of the two courses offered each semester held lectures on Monday, Wednesday and Friday, while the other was slated for Tuesday, Thursday and Saturday. Initially, the idea of a program that asked viewers to wake up at dawn to watch lectures was met with incredulity. Detractors doubted that such an early slot in the broadcasting schedule would be able to draw viewers; one critic lambasted the notion of a $75 course (almost $700 in 2021) without professor-student interactions as a “fraud.” But “Sunrise Semester” defied expectations, drawing 74,000 viewers and 177 enrolled students in its first week alone. By the end of the semester, 120,000 Americans were rising early regularly to watch professor Floyd Zulli, a charismatic scholar of romance languages, teach the show’s inaugural course, an introductory class on comparative literature. The Red and the Black, an 1830 French novel and the first book on the course’s reading list, reportedly sold out at almost every bookstore in a 30-mile radius of New York City, prompting Random House to issue a reprint, writes Steven D. Krause in More Than a Moment: Contextualizing the Past, Present, and Future. Variety deemed the program “the first unquestioned hit show of the 1957 television season.” Fan mail sent in from viewers across North America echoed this enthusiasm, with one New Yorker saying, “I don’t usually write fan letters. But if you’ve got a fan club, I’ve just joined it.” Courses on offer ranged from art history to philosophy, math and astronomy. 
During the show’s 13th season, in fall 1976, a class titled “Communication: The Invisible Environment” aimed to show “how, on an unprecedented scale, our lives are being changed by new media and technology,” according to an NYU promotional brochure. The pamphlet added that “the natural environment … recedes in the face of ‘mediated’ environments which increasingly govern our way of seeing, knowing and valuing”—an idea that continues to resound today. Other classes aired between 1957 and 1982 included “The World of Islam,” “The Meaning of Death” and “A History of African Civilization.” Viewers could take a course—consisting of half-hour lectures, a term paper, two mail-in assignments and a final exam—for three points of undergraduate credit from NYU or (eventually) other universities. Classes had in-person components, too: Finals took place on NYU’s campus, and in the show’s earlier years, the school invited students who had completed courses to meet their professors at a gala. For those who had tuned in to lectures but didn’t want to complete additional coursework, the university offered a $35 certificate of completion. Overall, casual viewers made up the vast majority of audience members. Few demographic analyses of “Sunrise Semester”’s viewership exist, but a study conducted between 1958 and 1959 revealed that, on average, students who formally registered for credit or a certificate had been out of school for 11 years. Another study conducted by NYU found that the show’s audience during its first year was 70 percent female and 30 percent male. Flouty’s analysis of fan mail revealed a similarly woman-heavy audience, with many viewers identifying themselves as housewives whose children had left the nest. Flouty theorizes that individuals—especially women—who were unable to pursue higher education in the post-Great Depression era of the 1950s were forced to seek unconventional alternatives like “Sunrise Semester.” The show’s early time slot enabled housewives in particular to fit in learning before housework duties. “I have this suspicion that there's something very empowering about that moment and women being able to be free from the household chores, maybe existing in this empty nest, so that it's their own time,” Flouty says. Writing in her 2016 dissertation, the scholar added, “’Sunrise Semester’ sought to mimic a liberal arts education, which was possibly what many of these women gave up during World War II or during the Great Depression.” Though enrollment in higher education tends to rise during economic recessions, with individuals seeking ways to improve their job prospects, a 1932 study found that in 1930, the first year of the Great Depression, women’s enrollment actually fell. The study, which examined the period of 1860 to 1930, concluded that women were more likely to drop out or postpone their studies due to “difficult family financial situations”—in other words, men often had more savings and were better equipped to cover college expenses. Following World War II, wrote scholar Patsy Parker in a 2016 study, women were released from their wartime jobs at a 75 percent higher rate than men. This mass exodus from the workforce, in combination with a growing apathy, and even hostility, toward women on college campuses, left many with limited options beyond the domestic sphere. As a professor who has herself been teaching online classes during the Covid-19 pandemic, Flouty says she expects to see a similar set of circumstances arise in the coming years. 
“The reason that [the women] would have stepped away from their college dreams in the [19]20s would have been there was no money to send anyone to college, and I think we're going to have a similar effect now,” she explains. “It has definitely occurred to me how much more poignant the story is now that we’re actually looking at the economic freefall against the backdrop of a virus that remains unchecked and unsolved.” (Last year, the pandemic took an unprecedented toll on working women: In October, the National Women’s Law Center reported that 80 percent of the 1.1 million Americans who dropped out of the labor force between August and September were women. Many of these individuals were laid off from female-dominated fields like hospitality and education or found themselves unable to continue working while assuming a disproportionate share of responsibilities at home.) “Sunrise Semester”’s eventual demise came down to finances. As a highly unprofitable venture, the show faced increasing pressure to monetize. Though plans existed to improve the low-budget series’ production value, CBS ultimately canceled the show to make space for early-morning news, a more commercially viable option. Very few episodes of “Sunrise Semester” are available today. According to Flouty, new lectures were taped over older recordings that had already aired. But while the show has largely fallen out of public memory, its legacy is apparent today in the form of massive open online courses. Better known as MOOCs, these free classes adapt the concept of “Sunrise Semester”—repurposing the most prevalent technology of the day to provide accessible education for learners of all walks of life—for the digital age. The popular MOOC search site Class Central estimates that 120 million people worldwide enrolled in the courses in 2019. These numbers have surged amid the Covid-19 pandemic: Coursera, edX and FutureLearn, the three most popular MOOC providers, saw as many new users register in April 2020 as in the whole of 2019, according to EdSurge. Perhaps if Cora Gay Carr were alive today, she, too, would enroll in MOOCs. Her success story, despite taking place a half-century ago, holds particular resonance in this time of remote learning: After earning her bachelor’s degree from NYU, Carr went on to earn a master’s degree in English. She later returned to her alma mater—this time, as a professor. Tara Wu is an editorial intern with Smithsonian magazine. She is a senior at Northwestern University, where she is majoring in journalism and environmental science.
3ef0f98f79cc7b53e01b784cda8fc777
https://www.smithsonianmag.com/history/1954-flying-car-for-sale-1281786/
1954 Flying Car for Sale
1954 Flying Car for Sale Ever dreamed of owning your own flying car… from the 1950s? If you happen to have $1.25 million lying around, you can make that happen! It seems every year we see companies like Terrafugia and Moller promise that the flying car will soon be an everyday reality. But people often forget flying cars have been around for over half a century. Greg Herrick, an aircraft collector in Minneapolis, is selling his 1954 Taylor Aerocar N-101D with an asking price of $1.25 million. His flying car of the retro-future sports a yellow and black body and as you can see from the photo above, still works! Herrick has over 40 aircraft in his private collection and the Aerocar was one of the first he ever purchased. He bought the flying car in the early 1990s from a man in Idaho and says he was drawn to the Aerocar just as many people in the latter half of the 20th century were. “I was just at the tail end of that generation that kind of grew up with that dream of… well, I guess every generation has had that dream since the automobile — of a flying car,” Herrick told me. The Aerocar was designed by Moulton Taylor in 1949 and only five were ever produced. In order to take flight, the Aerocar must be converted into an aircraft with wings that fold forward. Though it looks cumbersome, the vehicle was marketed in the early 1950s as being so effortless that a woman could do it “without soiling her gloves.” The video below is a newsreel about the Aerocar from November 5, 1951. Herrick’s Aerocar was first listed for sale in December 2011. His most recent listing includes some of the specs: The AEROCAR features side-by-side seating for two. Advanced for its time, most of the fuselage skin is of composite material and the car is front wheel drive. In flight the wings are high and unobtrusive. Powered by a Lycoming O-320 Engine the propeller is mounted at the end of a long tail cone, the latter angled up for propeller clearance. Cruise speed is about 100 mph. Takeoff speed is 55 mph and the airplane is controlled by the same steering wheel as is used for driving. But why sell it? “I like rarity. I like unusual things,” Herrick tells me. “I like things that represent progress or tell a story. But as time passes your tastes start to become more refined. And no matter what it is you’re doing you can’t collect everything and you can’t be an expert in every area. So my interests began to migrate toward the golden age of aviation between the wars — in particular the aircraft that were almost lost to history. So this airplane is kind of superfluous to my needs.” But if you’re thinking about buying this blast from the past don’t forget that you’ll need two kinds of insurance! “When I bought the thing, I was looking at the insurance and I had to have two different insurance policies: an aviation policy and then I had to get an auto policy,” Herrick said. Making sure you have two kinds of insurance is certainly one of those realities that The Jetsons never warned us about. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
439b0bac9919f584fc25e6df25b51848
https://www.smithsonianmag.com/history/1964-republican-convention-revolution-from-the-right-915921/
How the 1964 Republican Convention Sparked a Revolution From the Right
How the 1964 Republican Convention Sparked a Revolution From the Right There were only three small elevators at the Mark Hopkins, the splendid old San Francisco hotel that served as headquarters for contenders Barry Goldwater and William Scranton during the 1964 Republican National Convention. The wait that hot July week could stretch to 45 minutes. The day Goldwater was to accept the nomination at the Cow Palace in nearby Daly City, he caught a service elevator in the hotel kitchen. That was where a reporter cornered the Arizona senator and asked him whether the Democrats would campaign on the fact that nearly 70 percent of the convention delegates, acting on his campaign's instructions, had voted down a platform plank affirming the constitutionality of the recently passed Civil Rights Act. "After Lyndon Johnson—the biggest faker in the United States? He opposed civil rights until this year. Let them make an issue of it," Goldwater snapped back. "He's the phoniest individual who ever came around." Goldwater's tone reflected the tenor of this ugliest of Republican conventions since 1912, as entrenched moderates faced off against conservative insurgents. In an era in which a national consensus seemed to have coalesced around advancing civil rights, containing Communism and expanding government, the moderates believed they had to win to preserve the Republican Party. The conservatives—who wanted to contain the role of the federal government and roll back Communism—believed they were saving not just the party but Western civilization. The logy Mark Hopkins elevators gave the insurgents, flooding into town for what Goldwater biographer Robert Alan Goldberg called the "Woodstock of the right," at least two chances a day to bait Chet Huntley and David Brinkley, anchors of NBC's nightly newscast—and crypto-liberals, according to their harassers. "You know, these nighttime news shows sound to me like they're being broadcast from Moscow," one conservative observed to another on the way down, loud enough for the two newsmen to hear. Brinkley forbade his son, Alan, to show his NBC insignia, except to security. The volume of right-wing rage at the media was novel at this Republican convention. Unprecedented, too, was the attention focused on the issue of television coverage. The convention was the first since CBS and NBC had expanded their nightly newscasts from 15 minutes to 30 minutes, and the first since the assassination and funeral of President John F. Kennedy redefined the bond between television and politics. In 1960, there were about as many journalists, both print and broadcast, as delegates. Four years later, broadcasters alone outnumbered delegates two to one. As it happened, Alan Brinkley grew up to become one of the most distinguished historians of 20th-century American politics. He has written of the 1964 conventions, Republican and Democratic, as transitional—managed by politicians who were accustomed to backroom deal-making and high-pressure crowd tactics and were caught up short to learn that they were suddenly in the business of producing a TV show. And what a show the GOP convention was!
Conservatives from the West, the South and the Midwest were convinced that the only way moderate "Wall Street Republicans" had been able to run away with the presidential nomination every four years was that "a few secret kingmakers in New York" conspired to steal it, as Illinois activist Phyllis Schlafly put it in a self-published book, A Choice Not an Echo, several hundred thousand copies of which were distributed in the summer of 1964. (Some convention delegates reported receiving more than 60 copies in the mail.) They weren't going to let it be stolen this time. Goldwater's finance chairman, Bill Middendorf, warned campaign aide Dean Burch that "the 1952 tricks will be used again": planted stories, whispering campaigns, threats, cajolery and the "shanghaiing and spiriting of delegates and alternates to distant points." Goldwater delegates were warned to be on the lookout "for unexpectedly easy companionship from new-found female friends." They were to contact the Goldwater headquarters on the 15th floor of the Mark Hopkins immediately after landing at the airport and to travel around town in pairs along pre-timed routes in radio-equipped cars. They used walkie-talkies only as back-ups, because these could be too easily tapped into—as, indeed, they had tapped into Scranton's. Bill Scranton, whose patrician family ran the Pennsylvania coal town that bore his name, seemed to comedian Dick Gregory like "the guy who runs to John Wayne for help." (Goldwater looked like a cowboy.) Scranton had entered the race as a last-minute act of noblesse oblige. "Today the nation—and indeed the world—waits to see if another proud political banner will falter, grow limp and collapse in the dust," he had said as he announced his candidacy just four weeks before the convention. "Lincoln would cry out in pain if we sold out our principles." According to a Harris Poll taken late that June, 62 percent of rank and file Republicans preferred Scranton to Goldwater, but the supposed Wall Street kingmakers were in dithering disarray. ("What in God's name has happened to the Republican Party!" muttered Henry Cabot Lodge —the party's 1960 vice presidential nominee—as he paged through the delegate list in his hotel room. "I hardly know any of these people!") The moderates' strategy was to put the Goldwaterites' perceived extremism on televised display, hoping delegates would flock to Scranton after being flooded by telegrams from outraged voters watching at home. The moderates circulated a translation of an interview Goldwater had given to a German newsmagazine, in which he was quoted as saying he would tell his generals in Vietnam, "Fellows, we made the decision to win, now it's your problem." CBS correspondent Daniel Schorr then reported, "It is now clear that Senator Goldwater's interview with Der Spiegel with its hard line appealing to right-wing elements in Germany was only the start of a move to link up with his opposite numbers in Germany," with Schorr basing his assertion simply on the fact that Goldwater would be vacationing after the convention at an American military installation that was, coincidentally, in the former Nazi stronghold of Bavaria. (Schorr later said he did not mean to suggest "a conscious effort" by Goldwater to connect with the German right.) Schorr's report only stirred the hornet's nest: the delegates who had trooped to the conservative Woodstock to nominate Goldwater greeted calls that they abandon him with angry defiance, and their loyalty put their candidate over the top. 
When Nelson Rockefeller, speaking to the assembled, advocated a platform plank denouncing extremism, galleries full of exuberant conservatives booed him. In his acceptance speech, Goldwater capped off the snub by lustily and defiantly proclaiming: "Extremism in the defense of liberty is no vice. And...moderation in the pursuit of justice is no virtue!" He raised the rafters. The "stench of fascism is in the air," Pat Brown, California's liberal Democratic governor, told the press. His view was widely shared. The political world's near unanimous judgment was that Goldwater's landslide loss to LBJ that November was a disaster for all Republicans, not just conservative Republicans. But Bill Middendorf would more accurately call his memoir of that year A Glorious Disaster. Out of its ashes and out of the fervent grassroots organizing that delivered Goldwater his unlikely nomination emerged a Republican Party surer of its identity and better positioned to harvest the bounty—particularly in the South—when the American mood shifted to the right during the cacophonous years that followed. Rick Perlstein is the author, most recently, of Nixonland: The Rise of a President and the Fracturing of America.
6757e7076993f30d9e9b2eee2743ea64
https://www.smithsonianmag.com/history/1968-democratic-convention-931079/
1968 Democratic Convention
1968 Democratic Convention As delegates arrived in Chicago the last week of August 1968 for the 35th Democratic National Convention, they found that Mayor Richard J. Daley, second only to President Lyndon B. Johnson in political influence, had lined the avenues leading to the convention center with posters of trilling birds and blooming flowers. Along with these pleasing pictures, he had ordered new redwood fences installed to screen the squalid lots of the aromatic stockyards adjoining the convention site. At the International Amphitheatre, conventioneers found that the main doors, modeled after a White House portico, had been bulletproofed. The hall itself was surrounded by a steel fence topped with barbed wire. Inside the fence, clusters of armed and helmeted police mingled with security guards and dark-suited agents of the Secret Service. At the apex of the stone gates through which all had to enter was a huge sign bearing the unintentionally ironic words, "HELLO DEMOCRATS! WELCOME TO CHICAGO." If this Potemkin village setting weren't enough to intensify anxiety among Democrats gathering to nominate their presidential candidate, the very elements and conditions of Chicago life contributed to a sense of impending disaster. The weather was oppressively hot and humid. The air conditioning, the elevators and the phones were operating erratically. Taxis weren't operating at all because the drivers had called a strike before the convention began. The National Guard had been mobilized and ordered to shoot to kill, if necessary. Even as delegates began entering this encampment, an army of protesters from across the country flowed into the city, camping in parks and filling churches, coffee shops, homes and storefront offices. They were a hybrid group—radicals, hippies, yippies, moderates—representing myriad issues and a wide range of philosophies, but they were united behind an encompassing cause: ending the long war in Vietnam and challenging Democratic Party leaders and their delegates to break with the past, create change—yes, that was the term then on every protester's lips—and remake the battered U.S. political system. As Rennie Davis put it, speaking as project director for the National Mobilization Committee to End the War in Vietnam, the largest and most important group for the planned protests: "Many of our people have already gone beyond the traditional electoral processes to achieve change. We think that the energies released...are creating a new constituency for America. Many people are coming to Chicago with a sense of new urgency, and a new approach." What followed was worse than even the most dire pessimist could have envisioned. The 1968 Chicago convention became a lacerating event, a distillation of a year of heartbreak, assassinations, riots and a breakdown in law and order that made it seem as if the country were coming apart. In its psychic impact, and its long-term political consequences, it eclipsed any other such convention in American history, destroying faith in politicians, in the political system, in the country and in its institutions. No one who was there, or who watched it on television, could escape the memory of what took place before their eyes.
Include me in that group, for I was an eyewitness to those scenes: inside the convention hall, with daily shouting matches between red-faced delegates and party leaders often lasting until 3 o'clock in the morning; outside in the violence that descended after Chicago police officers took off their badges and waded into the chanting crowds of protesters to club them to the ground. I can still recall the choking feeling from the tear gas hurled by police amid throngs of protesters gathering in parks and hotel lobbies. For Democrats in particular, Chicago was a disaster. It left the party with scars that last to this day, when they meet in a national convention amid evidence of internal divisions unmatched since 1968. To understand the dimensions of the Democrats' calamity, recall that in 1964, Lyndon B. Johnson had defeated Barry Goldwater for the presidency with 61.1 percent of the popular vote, a margin eclipsing even the greatest previous electoral victory, by Franklin D. Roosevelt over Alf Landon in 1936. In mid-1964, passage of civil rights legislation had virtually ended legal segregation in America. Optimists had begun talking about America's entering a "golden age." By that next summer, however, the common cause of blacks and whites marching together had been shattered as riots swept the Watts section of Los Angeles and, over the next two years, cities across the country. In that same initially hopeful year, the Johnson administration had made a fateful commitment to keep increasing the numbers of troops to fight a ground war in Vietnam, an escalation that would spawn wave upon wave of protest. In the 1966 congressional elections, Democrats—who had been experiencing the greatest electoral majorities since the New Deal—sustained severe defeats. As 1968 began, greater shocks awaited the nation: North Vietnamese forces launched the Tet offensive that January, rocking U.S. troops and shattering any notion that the war was nearly won. Johnson withdrew from the presidential campaign that March. Martin Luther King Jr. was assassinated in Memphis in early April, and another succession of riots swept the cities. Robert F. Kennedy, heir to the Kennedy legacy, had his presidential campaign cut down by an assassin's bullet after winning the critical California primary in June. It was against this extraordinarily emotional background that the Democrats convened. Hubert H. Humphrey, LBJ's vice president, had sat out the primaries but secured delegates controlled by the party establishment. Senator Eugene McCarthy—the antiwar candidate whose strong second-place showing in the New Hampshire primary had demonstrated Johnson's vulnerability—had abundant forces in the hall, but they were now relegated to the role of protesters. Senator George S. McGovern had rallied what remained of Kennedy's forces, but he, too, knew he led a group whose hopes had been extinguished. From whatever political perspective—party regulars, irregulars or reformers—they all shared an abiding pessimism over their prospects against a Republican Party that had coalesced behind Richard M. Nixon. They gave voice to their various frustrations in the International Amphitheatre during bitter, often profane, floor fights over antiwar resolutions. The eventual nomination of Humphrey, perceived heir to Johnson's war policies, compounded the sense of betrayal among those who opposed the war. The bosses, not the people who voted in the primaries, had won.
The violence that rent the convention throughout that week, much of it captured live on television, confirmed both the Democrats' pessimism and the country's judgment of a political party torn by dissension and disunity. In November the party would lose the White House to Nixon's law-and-order campaign. In the nine presidential elections since, Democrats have won only three, and only once—in 1976, after the Watergate scandal forced Nixon to resign in disgrace—did they take, barely, more than 50 percent of the votes. Changes in party rules have curtailed the establishment's power to anoint a presidential nominee, but the ideological divides have persisted; thus this year's rival candidates battled bitterly to win state primaries. And after such a divisive primary season, in the end the nomination still depended on the "superdelegates" that replaced the party bosses. One 1968 memory remains indelible 40 years later. Throughout that week I had been a guest commentator on NBC's "Today" show, broadcasting live from Chicago. Early Friday morning, a few hours after the convention ended, I took the elevator to the lobby of the Conrad Hilton Hotel, where I had been staying, to head for the studio. As the elevator doors opened, I saw huddled before me a group of young McCarthy volunteers. They had been bludgeoned by Chicago police, and sat there with their arms around each other and their backs against the wall, bloody and sobbing, consoling one another. I don't know what I said on the "Today" show that morning. I do remember that I was filled with a furious rage. Just thinking of it now makes me angry all over again. Haynes Johnson, who has written 14 books, covered the 1968 Democratic National Convention for the Washington Star.
c4d2b84ea8d583300c85cbd1a04db2e6
https://www.smithsonianmag.com/history/1968-three-students-were-killed-police-today-few-remember-orangeburg-massacre-180968092/
In 1968, Three Students Were Killed by Police. Today, Few Remember the Orangeburg Massacre
In 1968, Three Students Were Killed by Police. Today, Few Remember the Orangeburg Massacre Recalling the event decades later, Robert Lee Davis remembered the chaotic noise and fear that permeated the night of February 8, 1968. “Students were hollering, yelling and running,” Davis said. “I went into a slope near the front end of the campus and I kneeled down. I got up to run, and I took one step; that’s all I can remember. I got hit in the back.” He was among the 28 students of South Carolina State College injured that day in the Orangeburg Massacre; his friend, freshman Samuel Hammond, who had also been shot in the back, died of his wounds. Later that night, Delano Middleton and Henry Smith would also die; all three killed by the police were only 18 years old. Despite being the first deadly confrontation between university students and law enforcement in United States history, the Orangeburg Massacre is a rarely remembered tragedy. Occurring two years before the better-known Kent State University shootings, and two months before the assassination of Martin Luther King, Jr., the incident “barely penetrated the nation’s consciousness,” writes Jack Bass in his 1970 book The Orangeburg Massacre. Fifty years later, the events of the evening remain contested, and no formal investigation into the incident has ever been undertaken. Although some news organizations, including the Associated Press, characterized the shootings as a “riot” at the time, the Orangeburg Massacre came after a long series of clashes with local law enforcement and politicians. The city, located between Columbia and Charleston, had about 14,000 residents at the time of the killings. Home to South Carolina State College (today South Carolina State University) and Claflin College, both HBCUs, Orangeburg “played a really important role in the activism happening throughout South Carolina,” says Jack Shuler, a professor of English at Denison University and the author of Blood and Bone: Truth and Reconciliation in a Southern Town. King himself came through the town on multiple occasions to deliver speeches, students protested for desegregation, and pastors worked to foster change throughout the community, Shuler says. “The massacre wasn’t just a random thing that happened. It was part of the longer story, which goes back to the founding of the community.” By the winter of 1968, students at the two colleges had set their sights on one particular target: All-Star Bowling Lanes, owned by white proprietor Harry Floyd. Despite the passage of the 1964 Civil Rights Act, which outlawed discrimination based on race, color, religion, sex or national origin, Floyd continued to refuse African-Americans service. On February 5, a group of students went to the bowling alley and defiantly sat at the lunch counter until the police were called and the business closed early. The next day, the students returned and again entered the bowling alley, whereupon 15 of them were arrested. Hearing word of the arrests, hundreds of students poured into a parking lot nearby. Orangeburg police officers and state troopers confronted the growing crowd. Tensions began to ease once the arrested students were told they’d be freed, but at just that moment a fire truck arrived, causing new pandemonium.
As civil rights activist and university educator Cleveland Sellers wrote in his autobiography, the fire truck suggested to the crowd that the authorities were ramping up their efforts because the powerful hoses had been turned on them during a demonstration in 1963, causing injuries and illness. Pushed against the front doors of the bowling alley in their panic, the students knocked in a glass pane and were immediately set upon by the police officers, who brutally beat several young women. As the students fled for their respective campuses, several broke shop windows and defaced cars along the way. By February 7, Orangeburg mayor E.O. Pendarvis agreed to address the students. Although the meeting was largely unproductive, the mayor did agree to share the students’ requests with the city council. Among their demands were a call to end police brutality, a commission on fair employment in Orangeburg, the elimination of discrimination in public services like doctors’ offices, and the creation of a biracial human relations committee. But South Carolina governor Robert McNair had already called in the National Guard, further escalating the sense of impending disaster. “Had this been a protest at Clemson or University of South Carolina [two mostly white schools that had only integrated five years prior], I have no doubt that the governor wouldn’t order in the National Guard,” says Reid Toth, associate professor of criminal justice at the University of South Carolina Upstate. “If you had a group of white students marching the streets in protest of integrating, you wouldn’t have seen the governor sending in the National Guard. It comes down to a terrible part of the history of my home state, which I love, but is still to this day battling the same sense of fear—that black people are dangerous.” On the night of February 8, more than 100 students gathered on the South Carolina State College campus and began shouting at the armed officers stationed around them. While some students chanted “black power,” others began singing “We Shall Overcome.” When the students lit a bonfire to keep warm, patrolmen again called in a fire truck, exacerbating tensions. Then, at 10:30 p.m., patrolman David Shealy was injured when someone tossed a foreign object (what it was, whether a banister or something smaller, is contested) that hit him in the face. Minutes later, nine State Highway patrolmen opened fire on the unarmed students. In the aftermath, many—including Governor McNair—argued the students had begun shooting first, despite there being no evidence that any students had firearms. Not only were the patrolmen using much higher-caliber ammunition than called for (the standard practice for dispersing riots was to use birdshot, while the officers here used the much larger double-ought buckshot), but the vast majority of students were injured in a way that indicated they were attempting to flee. All but two “had been shot in the back, side, or through the soles of their feet,” writes Toth. Although the massacre earned some national media attention, the stories disappeared quickly and many contained significant errors. (The Associated Press reported the incident included “a heavy exchange of gunfire” and never issued a correction.) “This was 1968, not 1964, and in the intervening years civil rights demonstrations had come to be seen as ‘riots’—and most whites seemed to feel that it was justified to put them down as brutally as possible,” wrote historian Dave Nolan.
That’s not to say the massacre was forgotten by African-American communities; it received widespread coverage in the Chicago Defender and other newspapers, prompted marches and vigils at South Carolina HBCUs and at the University of Chicago, and led white students at a meeting of the National Student Association to organize “white alert teams” to act as buffers between black students and law officers. As for the nine patrolmen who opened fire, they were exonerated of all charges in a 1969 trial. The only person convicted of any charges in association with the massacre was Sellers, the activist who had been shot while on campus. He spent seven months in a state penitentiary for inciting the protests and wasn’t pardoned until 25 years later. “I was targeted because of my work with the Student Nonviolent Coordinating Committee,” Sellers said. “I was on the FBI’s militant radical list. The jury at my trial had two African-Americans but their only possible verdict (in order to remain in South Carolina) was ‘guilty.’ South Carolina was known for forcing uppity blacks to flee.” In 2001, South Carolina governor Jim Hodges apologized on behalf of the state, and Orangeburg mayor Paul Miller issued another apology from the city in 2009. But calls by state legislators like Bakari Sellers (the son of Cleveland Sellers) for a formal state investigation of the incident have gone unanswered. For Toth, the repercussions of forgetting such important aspects of the state’s history are larger than the neglect felt by the victims and their families; they become systemic issues. She points to a lack of funding for historically black colleges and universities as an indication that historical amnesia has modern consequences. “That is part of the overall benign neglect of failing to address events, whether they’re positive or negative, that impact the black community,” Toth says. “The hardest thing I’ve ever had to do as a scholar is write research on this topic as a non-emotional objective academic, because we should know the names of the three gentlemen who were shot just as we know those in Mississippi Burning and Kent State.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
8e7c1b70b548dc6ba64cc4fd79efea04
https://www.smithsonianmag.com/history/1980s-far-left-female-led-domestic-terrorism-group-bombed-us-capitol-180973904/
In the 1980s, a Far-Left, Female-Led Domestic Terrorism Group Bombed the U.S. Capitol
In the 1980s, a Far-Left, Female-Led Domestic Terrorism Group Bombed the U.S. Capitol Amidst the social and political turmoil of the 1970s, a handful of women—among them a onetime Barnard student, a Texas sorority sister, the daughter of a former communist journalist—joined and became leaders of the May 19th Communist Organization. Named to honor the shared birthday of civil rights icon Malcolm X and Vietnamese leader Ho Chi Minh, M19 took its belief in “revolutionary anti-imperialism” to violent extremes: It is “the first and only women-created and women-led terrorist group,” says national security expert and historian William Rosenau. M19’s status as an “incredible outlier” among male-led terrorist organizations prompted Rosenau, an international security fellow at the think tank New America, to excavate the inner workings of the secretive and short-lived militant group. The resulting book, Tonight We Bombed the Capitol, pieces together the unfamiliar story of “a group of essentially middle-class, well educated, white people who made a journey essentially from anti-war and civil rights protest to terrorism,” he says. After the group’s formation in 1978, its tactics escalated from picketing and poster-making to robbing armored trucks and abetting prison breaks. In 1979, they helped spring explosives-builder William Morales of the Puerto Rican nationalist group FALN and Black Liberation Army organizer Assata Shakur (née Joanne Chesimard) from their respective prisons. (Both Shakur and Morales remain on the FBI’s wanted lists for terrorism and are thought to live in Cuba.) Eventually, M19 turned to building explosives themselves. Just before 11 p.m. on November 7, 1983, they called the U.S. Capitol switchboard and warned that the building should be evacuated. Ten minutes later, a bomb detonated in the building’s north wing, harming no one but blasting a 15-foot gash in a wall and causing $1 million in damage. Over a 20-month span in 1983 and 1984, M19 also bombed an FBI office, the Israel Aircraft Industries building and the South African consulate in New York, as well as Washington, D.C.’s Fort McNair and Navy Yard (which they hit twice). The attacks tended to follow a similar pattern: a warning call to clear the area, an explosion, a pre-recorded message to the media railing against U.S. imperialism or the war machine under various organizational aliases (never using the name M19). Who were these domestic terrorists sought by the FBI? Rosenau writes of “self-described ‘corn-fed girl’” Linda Sue Evans, whose politics took a radical turn while attending Michigan State University in the midst of the Vietnam War. Many M19 members’ stories echo Evans’s—college activism (at schools including Cornell, Berkeley, Radcliffe and Hampshire College) shaped their far-left worldviews, and for some, their status as out lesbians put them at odds with a heteronormative, patriarchal society. M19 membership typically followed involvement with other far-left groups. New Yorker Susan Rosenberg, one of M19’s earliest members, traveled to Cuba with the Castro-friendly Venceremos Brigade, and Italian-born Silvia Baraldini was part of a front for the militant Weather Underground. Along with several others, Alan Berkman, a Columbia-trained doctor who was one of the few men in the M19 inner circle, was involved with the John Brown Anti-Klan Committee.
As M19’s spree grew more violent, its members became ever more insular and paranoid, nearly cultish, living communally and rotating through aliases and disguises until, in 1985, law enforcement captured the group’s most devoted lieutenants. After that, Rosenau writes, “The far-left terrorist project that began with the Weathermen … and continued into the mid-1980s with May 19th ended in abject failure.” Smithsonian magazine asked Rosenau about the left-wing extremist group, his research process and how this case study of domestic terrorism is relevant today. Where would you position M19 relative to groups that people may be more familiar with, like the Weather Underground? They are sort of an offshoot of the Weather Underground, which essentially cracked up in the mid-1970s. These women decided to continue the armed struggle. Many of them had been in the Weather Underground, but they thought the Weather Underground had made important ideological mistakes—that it saw itself as a vanguard of revolution, when in fact the real revolutions were going on in the third world, or in the United States itself, in places like Puerto Rico or among Native Americans. The real revolutionaries were these third-world freedom fighters, and it should be the job of North American anti-imperialists, as they called themselves, to support those liberation movements in whatever way they could. So if that meant bombing the Navy to protest the role of the United States in Central America in the early 1980s, they would do that. If it meant attacking the South African consulate in New York that represented the apartheid regime [which they did in September 1984], they would do that. They really saw themselves as supporters and followers of these third-world struggles in the Middle East, in southern Africa and in this hemisphere particularly. They talked about themselves as being in the belly of the beast, being at the center of this imperialist monster. So they had a particular responsibility, in their view, to carry out actions to bring this monster to heel. Given how secretive M19 was, what was your research process like? I'm a historian by training, so I really concentrated my efforts on archives. Unlike a lot of other people who study terrorism, I plowed into court records. There were multiple trials involving the women and men of May 19th, and fortunately those are all preserved in the Federal Records Centers, which are part of the National Archives. So I spent days going through boxes of federal court records, which have everything from transcripts to affidavits from FBI agents to grand jury testimony to evidence picked up at the various crime scenes. Those trial records were absolutely invaluable for really getting inside this group. Two of the members had donated their papers, one to Smith College and one to Columbia University Medical Center, and these were incredibly valuable—everything from high school essays to photographs of trips to Vietnam in 1975 to what looks like a picture taken before a college prom, and things like transcripts of parole hearings. 
Like most terrorist groups, they tried not to leave a trail, but in fact they wound up leaving a substantial paper trail. What surprised you the most? Toward the tail end of their life cycle as a group, they debated amongst themselves, quite intensely, the assassination of police officers, of prosecutors, of military officers. And while it's true that none of their bombings killed anyone, they certainly contemplated it. From the court records, [I learned that] they had these inventories of weapons and dynamite and detonation cord and Uzi machine guns, fully automatic with sawed-off barrels. They had incredible arsenals, and I guess they would probably argue that was for self-defense. But it seems like they were at least preparing for something much more apocalyptic. Fortunately, it never happened. M19 is unique in being a woman-founded and -led terrorist organization. Did that influence its objectives or shape it in any particularly distinguishing way? They certainly were much more feminist and pro-woman than the Weather Underground, which was notoriously misogynistic. They were acutely conscious of any kind of sexism within themselves. The liberation of women, gay people and racial minorities was much more at the forefront for them than for groups like the Weather Underground. It’s important to realize they didn't really believe in so-called “bourgeois feminism”—the National Organization for Women, equal pay, all that stuff. Yeah, that was all nice, but they considered it a distraction; women's liberation would actually come with political revolution. And that was the important thing, right? All these other things would flow when imperialism was defeated, when capitalism was defeated. Like a lot of terrorist organizations, they left what this future utopia would actually look like a bit vague. I think that's probably the big difference: their hatred of misogyny and their very self-conscious efforts to root out misogyny within their ranks. You write, "Despite claims by Fox News and others that Antifa activists are ‘terrorists,’ their street brawling and harassment of right-wing extremists hardly rise to the level of the left-wing political violence of the 1960s, ’70s and ’80s. The same cannot be said about the Neo-Nazi and white supremacist violence." Could you talk about the parallels or lack thereof between the left-wing terrorism you write about and some of the domestic terrorism we're seeing today? The white supremacists [today] are obviously not as structured. You do have coherent groups like Atomwaffen Division, an extremely dangerous right-wing extremist group. But one of the things they share is that ideology is extremely important to them. They have a—I wouldn't call it a coherent worldview, but they have some very well-defined ideological notions about how the world works. That's very similar to May 19th and the far-left extremist groups of the ’60s, ’70s and ’80s: they're not crazy. Some of them are highly intelligent and articulate. They are strategic in their thinking, meaning that they have ways, ends and means. They're careful in their plotting. The idea that somehow these right-wing extremists today are just, I don't know, pissed-off young guys who hate black people and immigrants—yeah, they are. But they also have some very poisonous ideas, which actually have pretty deep roots. Systemic terrorism has been a deep, deep part of our history. 
After the Civil War, it's not just the Ku Klux Klan, but outright insurgency against Republicans in Southern states by white militias and white supremacists. One of the things I'm trying to bring forth in the book is this notion, to quote [Black nationalist leader H. Rap Brown], "Violence is as American as cherry pie." Terrorism is not an exception, a one-off, a random thing. It is deeply ingrained in our politics and society and history. Are there places where you see flawed comparisons or where parallels can't or shouldn't be drawn? Historical context is absolutely paramount. We kind of lump terrorism together—groups as disparate as Students for a Democratic Society, Al Qaeda, the Red Army Faction and Aum Shinrikyo—but these are all products of particular times and particular places. For example, I don't see circumstances in which left-wing violent extremism today becomes anywhere near what it was in the early 1970s. I just don't think the conditions exist, and it's hard to imagine those conditions developing. You had the Vietnam War, a national draft... People talk about polarization now, but just look at the early 1970s, when literally thousands of bombs were set off per year. The important thing is just to realize that there are some similarities, but these are very different periods in time, and each period of time is unique. Lila Thulin is the digital editorial assistant for Smithsonian magazine.
5d675a6059132bd3bf3af029ce3d9263
https://www.smithsonianmag.com/history/2000-year-history-restaurants-and-other-new-books-read-180974739/?fbclid=IwAR2odqw2FbrpET81391kOgVOuk5zPVZQX2-kdhsSN6lqNzBju_lBFteSvRI
A 2,000-Year History of Restaurants and Other New Books to Read
A 2,000-Year History of Restaurants and Other New Books to Read With much of the world on lockdown amid the COVID-19 pandemic, dining out has become a cherished memory of the “before” period. Though diners can still support local restaurants by ordering food for curbside pickup or delivery, actually sitting down at any eatery, be it a tavern, café, noodle joint or fine dining establishment, appears to be off-limits for the foreseeable future. In these trying times, William Sitwell’s The Restaurant: A 2,000-Year History of Dining Out—one of five new nonfiction titles featured in this week’s books roundup—may offer some culinary comfort, enabling readers to dine vicariously via its author’s colorful prose. And, if it helps at all, know that Sitwell is similarly missing the experience of dining out. As the British restaurant critic wrote for the Telegraph earlier this month, “I’m at home, staring out of the window and dreaming of what I might eat after the crisis, quietly chastising myself that, just a few weeks ago, I felt I was tiring of all my endless eating out.” The latest installment in our “Books of the Week” series, which launched in late March to support authors whose works have been overshadowed amid the COVID-19 pandemic, details the evolution of restaurants, the women pilots of World War II, the history of sugarcane and rum production on the Yucatán Peninsula, a New York Times journalist’s struggle to come to terms with his immigrant identity, and a Wild West shootout. Representing the fields of history, science, arts and culture, innovation, and travel, our selections are texts that piqued our curiosity with their new approaches to oft-discussed topics, elevation of overlooked stories and artful prose. We’ve linked to Amazon for your convenience, but be sure to check with your local bookstore to see if it supports social distancing-appropriate delivery or pickup measures, too. Sitwell’s comprehensive history begins with the taverns and restaurants of Pompeii and concludes with a chapter on the future of dining out. (The author’s prediction: “There will be new food concepts, new cutlery, space-age environments, new-fangled digital booking systems based on your history of preferences and your current bank balance.”) Detailing the 2,000 or so years between these developments, the Telegraph food critic makes leaps through time with pit stops in the Ottoman Empire, England, New York City, India, France and other locales. Along the way, he highlights such famous foodies as Marie-Antoine Carême, a 19th-century Frenchman considered to be the first celebrity chef, and Albert and Michel Roux, the French brothers behind postwar London’s Le Gavroche restaurant, in addition to lesser-known individuals like Juvencio Maldonado, a Mexican-born immigrant whose 1951 taco machine patent powered the rise of Taco Bell, and Yoshiaki Shiraishi, a Japanese innovator whose 1958 sushi conveyor belt “revolutionized the eating of fish.” Sitwell describes restaurants as sources of cultural innovation, reasons to travel, symbols of identity, sites of entertainment and more. People seek them out for reasons beyond simple sustenance: among others, he observes, “to meet, socialize, do business, romance a loved one”—and, on the zanier end of the spectrum, even “plot a coup.” During World War II, some 1,100 American women joined an elite class of aviators known as the Women Airforce Service Pilots (WASP). 
Formed by merging two existing units—the Women’s Auxiliary Ferrying Squadron (WAFS) and the Women’s Flying Training Detachment (WFTD)—in the summer of 1943, the program enabled female pilots to participate in non-combat missions essential to the war effort, including flying planes from factories to military bases, testing overhauled aircraft and towing targets used to train male air gunners practicing with live ammunition. As historian Katherine Sharp Landdeck writes in The Women With Silver Wings’ prologue, the WASPs transported 12,000 planes over 60 million miles, freeing up more than 1,100 male pilots to fly overseas for combat and, in doing so, proving “beyond a doubt that women pilots were just as skilled and tenacious as men.” Though they made significant contributions to the Allies’ eventual victory, the WASPs were disbanded in 1944, when a bill calling for the program to gain military status was narrowly defeated following backlash from civilian male pilots. Landdeck’s book details WASP members’ late-in-life efforts to ensure the women pilots’ role in the war was remembered. Central figures include Cornelia Fort, one of the 38 WASPs who died in service; WAFS leader and later ferrying operations commander Nancy Love; and wartime WASP head Jacqueline Cochran. New York Times bestselling author Tom Clavin concludes his self-described “Frontier Lawmen” trilogy with a rousing exploration of the Gunfight at the O.K. Corral, the 1881 battle that he says signaled the “last gasp of violent lawlessness … as ‘civilization’ took hold in the West.” Set against the backdrop of a “tense, hot summer” in Tombstone, Arizona, the book covers the events that led to the 30-second shootout, from the Mexican government’s crackdown on American cattle thieves to these outlaws’ increasingly brazen defiance of the law to brothers Wyatt, Virgil and Morgan Earp’s emergence as enforcers of order. At the heart of the conflict were two competing ideologies, Clavin tells Smashing Interviews magazine: the cowboys-turned-outlaws’ view of the West as a lawless haven for criminals and locals’ desire to see Tombstone become a “prosperous, civilized town.” The Wild Bill and Dodge City author’s underlying argument is that the divisions between the ostensibly “good” guys (the Earp brothers and Doc Holliday) and the “bad” (Ike and Billy Clanton and Tom and Frank McLaury) were more fluid than one might think. “[T]he Earps weren’t really the law in Tombstone and sometimes fell on the other side of the line,” notes Kirkus in its review of Tombstone, “and … the ordinary citizens of Tombstone and other famed Western venues [who] valued order and peace and weren’t particularly keen on gunfighters and their mischief.” When journalist and stand-up comedian Sopan Deb started writing his memoir in early 2018, he knew little about his parents beyond their names and the fact that they entered into an arranged marriage after immigrating to the United States from India. He couldn’t say how old they were, how many siblings they had, where exactly in India they were from, or even what their lives had been like prior to his birth. Still, Deb notes in the book’s first chapter, “Don’t get me wrong. I’m no Oliver Twist. 
… But there was a deep void in the relationship with my parents, a pervasive sense of unhappiness that reigned over the home.” Much of this disconnect stemmed from young Deb’s desire to blend in with his white, suburban New Jersey classmates—a refutation of Bengali identity that contrasted starkly with his parents’ pride in their heritage. By age 30, he writes, he considered the pair “distant footnotes from my past.” This mindset changed after Deb, then working as a reporter for CBS News, covered Donald Trump’s 2016 presidential campaign. “I spent my whole life running from who I am,” says Deb to NBC News’ Lakshmi Gandhi. “After covering the Trump campaign, I knew I didn’t want to run anymore.” Deb then embarked on a journey that took him to India, where his father had unexpectedly moved in 2006, and to his estranged mother’s home in New Jersey. Rebuilding these relationships proved predictably challenging, and as Deb tells NBC News, the process continues to this day. “[My] book is for anyone that has a relationship with someone that should be better,” he says. “I hope that they come away from it thinking that it is never too late to bridge the gap. That doesn’t mean that you are guaranteed to succeed, but it is never too late to try.” Much like Augustine Sedgewick’s Coffeeland—a pick from the third installment in Smithsonian’s “Books of the Week” series that reveals the history of exploitation and violence behind the beloved caffeinated beverage—Gust and Mathews’ Sugarcane and Rum looks beyond the Yucatán Peninsula’s reputation as an idyllic getaway spot to expose the harsh conditions faced by its 19th-century Maya laborers. Hacienda owners implemented punitive economic systems in which workers became deeply indebted to their bosses, only to see their freedoms curtailed as a result. At the same time, the authors note, these men and women enjoyed a certain level of autonomy as an indispensable source of labor come harvest time. “What this history shows,” according to the book’s introduction, “is that sugarcane and rum are produced on a massive scale to satisfy the consumptive needs of the colonizers, which only compounds its exploitative nature as the products become available to the middle and working class.” Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
aec87e82a234c5a32a8a1abe275c10d7
https://www.smithsonianmag.com/history/35-who-made-a-difference-ed-bearss-113147798/
What Made Ed Bearss a Rock Star of Civil War History
What Made Ed Bearss a Rock Star of Civil War History Ed Bearss has what might best be called a battlefield voice, a kind of booming growl, like an ancient wax-cylinder record amplified to full volume—about the way you'd imagine William Tecumseh Sherman sounding the day he burned Atlanta, with a touch of Teddy Roosevelt charging up San Juan Hill. We're on a battlefield today, in fact. But now, unlike on a certain summer day 142 years ago, this corner of southern Pennsylvania is quiet, with fields of soybeans and corn drowsing under the midmorning haze. Quiet, that is, except for that voice: "George Armstro-o-ong Custerrr has been a brigadier general for all of five days. He's already got himself the larrrgest starrrs on his shoulders of any general in the Army. He's adopted a red neckerchief with a gold arro-o-ow stickpin in it. And he's just come within a hairrr of losing his life, 13 years before the Sioux Indians send him to the happy hunting grounds." Several dozen listeners stand silent, transfixed. In Civil War circles, Bearss is nothing short of a rock star. One of the men in the tour group wears a baseball cap covered with commemorative buttons celebrating each of Bearss' birthdays for the past decade (the latest is for his 82nd), while others have been known to wear T-shirts depicting his face on Mount Rushmore or transposed onto Elvis' white jumpsuit with the simple legend: "THE KING." What inspires such adulation? As a historian and battlefield guide, Bearss commands a prodigious store of knowledge. Today, he's spending several hours covering a brief, relatively minor sideshow to the Battle of Gettysburg. He's speaking without notes and admits it's been years since he's read a word about the skirmish on East Cavalry Field. Yet the details pour over us in a heady flow: Rebel cavalrymen on horses exhausted after a 200-mile trek from Virginia. Michigan troopers charging into battle to Custer's cry of "Come on, you Wolverines!" A Northern captain felled when a Confederate color-bearer drives the spear point of his guidon into the Yankee's open mouth. As he talks, Bearss marches back and forth, brandishing a silver-headed swagger stick, tucking it from time to time under his withered left arm—a casualty of a bullet at a battlefield on the other side of the world in 1944. He keeps his eyes tightly closed while he lectures; that way, he later tells me, he can see the events of 1863 unfolding before him. Some might say that Bearss has spent most of his life in the 19th century. He grew up with kerosene lamps and horse-drawn plows in Montana. He remembers Civil War stories told firsthand by the hometown veteran, "Grandpa" Henderson, who "used to sit around the hotel lobby with his reunion ribbons on." After serving in the Marines and earning degrees at Georgetown and Indiana universities, Bearss joined the National Park Service (where he is now chief historian emeritus) and devoted himself to the study of the American past, particularly the struggle between the blue and the gray. When he compares contemporary America to the 1860s, his allegiance is clear: "We're in an age of Teflon people now. People then were more original, more individual." Yet when he has to, Bearss can stand squarely in the present, as he has proved rather often of late, enmeshed in one 21st-century battle after another—over the suburban development that has threatened to engulf Civil War battlefields. 
Here at Gettysburg, for instance, the idyllic vista before us is broken by a water tower that went up a few years ago, part of a new industrial park. Just to the right of it, investors want to build a casino with 3,000 slot machines. It's a scenario that, in various permutations, has repeated itself at many sites over the past decade or so. Bearss is well-armed to support the preservationist side of the fight. He remembers visiting Manassas in 1941, when it was a sleepy rural area; now, when he leads bus tours there, they often end up stalled in shopping center traffic. At Petersburg in the early 1960s, he saw where an 1864 fort was bulldozed to make way for a mall; now the mall itself is nearly derelict. "The development is advancing more irresistibly than Grant's army did on Richmond," Bearss grumbles. "Ed's name carries a lot of weight," says Dean Shultz, a leader in the land-conservation movement at Gettysburg. Some years ago, a preservation group was debating whether to help purchase easements on the ground where Custer gathered his men for the East Cavalry Field assault. There was concern about whether the site was truly historic. "So finally I said I'd talked to Ed Bearss, and he said it had historic significance," he says. "And they said, 'Well, if Ed Bearss says it's worth saving, it's worth saving.'" Like Custer's men, preservationists now face a do-or-die moment, Bearss says. "The battles are going to be played out in the next 10 to 20 years, because by then the battlefield parks will be islands in urban corridors of the United States, in a sea of sprawling shopping malls." On East Cavalry Field, our tour draws toward a close beneath a granite column topped by a statue of a Union cavalryman. "The trumpets are playing," Bearss intones. "Thirteen hundred sabers are drawn. They flash in the sun. The Confederates are coming toward them: five regiments, riding boot to spur. Men of Michigan, are you ready? Charrrrrrrge!" And suddenly he's off, his swagger stick flailing—a hunched figure racing across the soybean field, charging fearlessly forward into the past.
eeaf845191d05e8f990f0fa92465477a
https://www.smithsonianmag.com/history/a-brief-history-of-wimbledon-156205892/?c=y&page=1
A Brief History of Wimbledon
A Brief History of Wimbledon For two weeks beginning in late June, the greatest tennis players in the world will converge on Wimbledon, a suburb on the southwestern outskirts of London. They will compete for a total of about $34.8 million in prize money, with the winners of the men's and women's singles competitions taking $2.4 million each. But more than that, they will be competing for a place in tennis history. John Barrett, a former Wimbledon player and author of Wimbledon: The Official History, says that Wimbledon is the most sought-after title in tennis because it's "the granddaddy of them all." Indeed, since the late 19th century, Wimbledon has been more than a site for the greatest players to shine; often, it has shaped the entire sport: "It is the history of tennis," Barrett says. The Overthrow of Croquet Monks and kings had played indoor ball games that resembled tennis since the Middle Ages, but it wasn't until the late 19th century that tennis acquired the form we recognize. In about 1873, an Englishman, Major Walter Clopton Wingfield, adapted indoor tennis to be played on grass, naming the game "sphairistike," after an ancient Greek game. Sphairistike quickly became popular among the idle upper classes, who were itching for a new sport to play: "The game has much more healthy and manly excitement than croquet," the Dundee Advertiser proclaimed (though the Sporting Gazette wondered "why a less jaw-breaking name could not be found"). As the game's popularity grew, various "lawn tennis" clubs—sphairistike yielding to a simpler term—arose to settle the question of just how it ought to be played. Among these was the All England Croquet Club, located near Wimbledon station, which in 1877 changed its name to the All England Croquet and Lawn Tennis Club and announced it would hold the first tennis championships, largely in order to raise money for "a pony-drawn roller for its croquet lawns," according to Cameron Brown, author of Wimbledon: Facts, Figures, and Fun. Within a few years, however, those croquet lawns were all but obsolete, and at one point the All England Club even dropped the word "Croquet" from its official name. Eventually it was reintroduced merely, says Barrett, "for sentimental reasons." Forging a Sport In the weeks prior to the first Wimbledon championships, the commissioners of the All England Club "achieved something truly remarkable," writes Heiner Gillmeister in Tennis: A Cultural History. "When the first ball at a Wimbledon tournament was served on Monday, 9 July 1877, they had laid down rules which have been allowed to stand until the present day, and with hardly any exception." Since then, the All England Club has been "the supreme court of appeal on the question of rules," codifying and shaping the game. This is not the only way in which Wimbledon has made tennis what it is. Since each year's championship would bring together the fiercest, most innovative players the sport has seen, the All England Club became an annual Darwinian laboratory where competitors were forced to adapt or perish. The first championships were won by a man named Spencer Gore, who employed the novel idea of approaching the net and swiftly volleying the ball left and right (his opponents, used to playing from the baseline, were flabbergasted). The following year, Gore's innovation was met with a new one, when a man named Frank Hadow in effect invented the lob shot, pitching the ball over Gore's head. 
A gentler game persisted at Wimbledon until 1881, when twin brothers William and Ernest Renshaw debuted the overhead serve they had been practicing against each other. The awe-struck spectators dubbed it the "Renshaw Smash," and it earned William seven titles that decade, and Ernest one. Though a mere 200 spectators had flocked to the first Wimbledon championships, the crowds had grown along with the game by the heyday of the "Renshaw Boom." Thousands were flocking to the stands by the mid-1880s, and by 1905, the championships would attract competitors from overseas. Tennis had grown up rather quickly. A Game for Amateurs Perhaps surprisingly, the program for the first championships specified that only "amateurs" were allowed to compete—something that remained true at Wimbledon for more than 90 years. If this seems incomprehensible, it is because "amateur" meant something very particular to the earliest organizers at Wimbledon: "the term amateur had become a synonym of gentleman," explains Gillmeister; "the term professional … had the stigma of the manual laborer." To the elite running exclusive country clubs of the day, sport wasn't sport unless it was played purely in one's spare time—which was a lot easier to do if you could afford to build a private court on the French Riviera, as the Renshaw brothers had. It wasn't until 1968 that Wimbledon first allowed professionals—players who in some manner were paid for their tennis ability—to compete at the championships, ushering in the "open era." "Open tennis came far too late," laments Barrett. He decries that professional athletes were viewed as "second-class citizens," and says that the decades-long insistence on amateurism "held back" the entire sport of tennis. Traditions Good and Bad "Tradition is a very strong part of Wimbledon," says Barrett—a fact that accounts both for the tournament's charm and for the more unsavory bits of its history. In some ways, the history of Wimbledon is a history of an institution slowly yielding its traditions to the changing times. Women began playing at Wimbledon in 1884, seven years after the men, but it has taken until this year for Wimbledon to institute complete prize money equality. The year 1920 was the first in which a woman played without wearing a corset, and it took until the 1930s before shorts were acceptable on either men (in '33) or women (in '39). Althea Gibson became the first African-American player invited to Wimbledon in 1951, and was the first black player to win the singles title, in 1957. Wimbledon refused to use yellow tennis balls, which are more easily captured by television cameras, until 1986. But Barrett says he would be loath to see one Wimbledon tradition disappear: grass. Wimbledon is the last of the four Grand Slam tournaments (the others are the French, Australian, and U.S. Opens) to use grass courts. "It would be a sad day if we ever failed to play it on grass," says Barrett, who loves the surface because it "is never the same two days running, so you have to be able to adapt very quickly." And naturally, the longstanding Wimbledon tradition of eating strawberries and cream is also widely loved: in one recent year, spectators consumed 59,000 pounds of strawberries and nearly 2,000 gallons of cream. There is one tradition, though, that Barrett and most of his fellow Englishmen would like to see broken: that of the English consistently losing at their own tournament. The last Englishwoman to win the singles at Wimbledon was Virginia Wade in 1977; the last Englishman, Fred Perry in 1936. 
David Zax has written brief histories of the Orient Express and the Honus Wagner baseball card. David Zax is a freelance journalist and a contributing editor for Technology Review (where he also pens a gadget blog).
614071a7580a1a35840b035dfc512b07
https://www.smithsonianmag.com/history/a-murder-in-salem-64885035/
A Murder in Salem
A Murder in Salem On the evening of April 6, 1830, the light of a full moon stole through the windows of 128 Essex Street, one of the grandest houses in Salem, Massachusetts. Graced with a beautifully balanced red brick facade, a portico with white Corinthian columns and a roof balustrade carved of wood, the three-story edifice, built in 1804, was a symbol of prosperous and proper New England domesticity. It was owned by Capt. Joseph White, who had made his fortune as a shipmaster and trader. A childless widower, White, then 82, lived with his niece, Mary Beckford (“a fine looking woman of forty or forty-five,” according to a contemporary account), who served as his housekeeper; Lydia Kimball, a domestic servant; and Benjamin White, a distant relative who worked as the house handyman. Beckford’s daughter, also named Mary, had once been part of the household, but three years earlier she had married young Joseph Jenkins Knapp Jr., known as Joe, and now lived with him on a farm seven miles away in Wenham. Knapp was previously the master of a sailing vessel White owned. That night, Captain White retired a little later than was his habit, at about 9:40. At 6 o’clock the following morning, Benjamin White arose to begin his chores. He noticed that a back window on the ground floor was open and a plank was leaning against it. Knowing that Captain White kept gold doubloons in an iron chest in his room, and that there were many other valuables in the house, he feared that burglars had gained access to it. Benjamin at once alerted Lydia Kimball and then climbed the elegant winding stairs to the second floor, where the door to the old man’s bedchamber stood open. Captain White lay on his right side, diagonally across the bed. His left temple bore the mark of a crushing blow, although the skin was not broken. Blood had oozed onto the bedclothes from a number of wounds near his heart. The body was already growing cold. The iron chest and its contents were intact. No other valuables had been disturbed. I first read of the Salem murder many years ago in a Greenwich Village secondhand bookshop. I’d ducked inside to escape a sudden downpour, and as I scanned the dusty shelves, I discovered a battered, coverless anthology of famous crimes, compiled in 1910 by San Francisco police captain Thomas Duke. The chapter on Captain White’s savage killing, evocative of the golden age mystery tales of the late 19th century, riveted me at once. The famed lawyer and congressman Daniel Webster was the prosecutor at the ensuing trial. His summation for the jury—its inexorable cadence, the slow gathering of dreadful atmospheric details—tugged at my memory, reminding me of Edgar Allan Poe’s tales of terror. In fact, after talking with Poe scholars, I learned that many of them agreed the famous speech had likely been the inspiration for Poe’s story “The Tell-Tale Heart,” wherein the narrator boasts of his murder of an elderly man. Moreover, I discovered, the murder case had even found its way into some of Nathaniel Hawthorne’s works, with its themes of tainted family fortunes, torrential guilt and ensuing retribution. Those facts alone proved an irresistible magnet to a crime historian like me. But the setting—gloomy, staid Salem, where in the 1690s nineteen men and women were convicted of witchcraft and hanged—endowed the murder case with another layer of gothic intrigue. It almost certainly fed the widespread (and admittedly lurid) fascination with the sea captain’s death among the American public at the time. 
The town, according to an 1830 editorial in the Rhode Island American, was “forever...stained with blood, blood, blood.” Soon after the discovery of the body, Stephen White—the murdered man’s nephew and a member of the Massachusetts legislature—sent for Samuel Johnson, a prominent Salem physician, and William Ward, Captain White’s clerk and business assistant. Ward made note of the plank at the open window, and near it he discovered two muddy footprints he believed had been made by the intruder. Decades before footprints were generally recognized as important evidence, Ward carefully covered them with a milk pan to shield them from the fine mist that had begun to fall. Meanwhile, Dr. Johnson’s cursory examination revealed the body was not quite cold; he concluded that death had occurred three to four hours earlier. Dr. Johnson then performed an autopsy before a “coroner’s jury” made up of local citizens, whose role was to assess the initial facts and determine whether a crime had taken place. In the jury’s presence, Johnson carefully examined the corpse, stripping off the shirt and inserting probes into some of the stab wounds to determine their depth and direction. He counted 13 stab wounds—“five stabs in the region of the heart, three in front of the left pap [nipple], and five others, still further back, as though the arm had been lifted up and the instrument struck underneath.” He attributed all the stab wounds to the same weapon, which suggested that there had been a single murderer. Though the wounds had oozed, there was no sign of spurting or spraying blood. Johnson interpreted this to mean that the blow to the head had come first, either killing White or stunning him, thereby slowing his circulation. Uncertain as to which of the many wounds was fatal, Johnson believed that a more complete autopsy was necessary. This was performed on April 8, at 5:30 in the evening. Dr. Abel Peirson, a medical colleague, assisted Johnson. A second post-mortem as thorough as this one was unusual in early 19th-century criminal investigations. In 1830, forensic science was still largely a footnote in legal and medical texts. But thanks to increasingly rigorous anatomical studies in medical schools, there had been progress in identifying murder instruments based upon the nature of the wounds and determining which had been the most likely cause of death. The surgeons agreed that the skull fracture was due to a single severe blow from a cane or bludgeon, and that at least some of the chest wounds were caused by a dirk (short dagger), the cross-guard of which had struck the ribs with enough force to break them. Peirson disagreed with Johnson’s initial assessment that there was likely only one assailant. A medical consensus was elusive, in part, because of the 36-hour interval between inquest and the second autopsy—which had allowed extensive post-mortem changes, affecting the appearance of the wounds, as had Johnson’s initial insertion of a probe. Stephen White gave the Salem Gazette permission to publish the autopsy findings. “However revolting the subject may be,” the newspaper said, “we have deemed it our duty to lay before our readers every particle of authentic information we can obtain, respecting the horrible crime which has so shocked and alarmed our community.” The possibility that more than one assailant might have been involved and that a conspiracy might be afoot fueled unease. 
Salem residents armed themselves with knives, cutlasses, pistols and watchdogs, and the sound of new locks and bolts being hammered in place was everywhere. Longtime friends grew wary of each other. According to one account, Stephen White’s brother-in-law, discovering that Stephen had inherited the bulk of the captain’s estate, “seized White by the collar, shook him violently in the presence of family” and accused him of being the murderer. Town fathers attempted to calm matters by organizing a voluntary watch and appointing a 27-man Committee of Vigilance. Although not burdened by any experience in criminal investigation, its members were given the power to “search any house and interrogate every individual.” Members took an oath of secrecy and offered a $1,000 reward for information “touching [on] the murder.” But the investigation went nowhere; the committee was confronted with a scenario of too many suspects and too little evidence. No one had made plaster casts of the incriminating footprints that Ward had carefully covered the morning of the murder. (By 1830, scientists and sculptors were using plaster casts for preserving fossil specimens, studying human anatomy and recreating famous sculptures—but the technique was not yet de rigueur in criminal investigations.) Since nothing had been stolen, the assailant’s motive puzzled townspeople and authorities alike. But revenge was not out of the question. As many in Salem knew, Joseph White was hardly the “universally respected and beloved” old man one local newspaper described. A bit of a domestic tyrant, he was given to changing his will at a whim and using his large fortune as a weapon to enforce his wishes. When his pretty young grandniece Mary announced her engagement to Joe Knapp, the old man declared Joe a fortune hunter, and when the marriage went forward without his consent, White disinherited Mary and fired Knapp. What’s more, White had been a slave trader. The ownership of slaves was abolished in Massachusetts in 1783 and the slave trade outlawed five years later. Yet White had boasted to Salem minister William Bentley in 1788 that he had “no reluctance in selling any part of the human race.” (In Bentley’s estimation, this “betray[ed] signs of the greatest moral depravity.”) In a water-stained letter written in 1789 that I found deep in the archives of the Peabody Essex Museum in Salem, a sailor named William Fairfield, who served on the schooner Felicity, told his mother about a slave revolt that had killed the ship’s captain. Joseph White was one of the owners of the Felicity. Some of White’s ships had engaged in legitimate trade, hauling everything from codfish to shoes. But many had sailed from Salem laden with tools and trinkets, to be traded in Africa for human cargo. Manacled and cramped into ghastly holds, many of the captives did not survive the voyage. Those who did were traded in the Caribbean for gold—enough to buy property, build a mansion and fill an iron chest. “Many maritime families in Salem supported the slavery system in one way or another,” says Salem historian Jim McAllister. That was how they had built their fortunes and paid their sons’ Harvard tuitions. There was an understanding in Salem society that this shameful business was best not spoken of, particularly in Massachusetts, where antislavery sentiments ran high. 
“A few of our merchants, like others in various seaports, still loved money more than the far greater riches of a good conscience, more than conformity with the demands of human rights, with the law of the land and the religion of their God,” Salem minister Joseph B. Felt later wrote of the year 1791. A little more than a week after the murder, Stephen White received a letter from a jailer 70 miles away in New Bedford. The letter said an inmate named Hatch, a petty thief, claimed he had crucial information. While frequenting gambling houses in February, Hatch had overheard two brothers, Richard and George Crowninshield, discussing their intent to steal Joseph White’s iron chest. The Crowninshield brothers were the disreputable scions of an eminent Salem family. Richard, according to court transcripts, was known to favor Salem’s “haunts of vice.” The town’s Committee of Vigilance brought Hatch in chains to testify before a Salem grand jury. On May 5, 1830, the jury indicted Richard Crowninshield for murder. His brother George—and two other men who were in his company at the gambling house—were charged with abetting the crime. All were detained in the Salem Gaol, a grim edifice of granite blocks, iron-barred windows and brick-walled cells. Then, on May 14, Joseph Knapp Sr., the father of the man who had married White’s disinherited grandniece, received a letter from Belfast, Maine. It demanded a “loan” of $350, and threatened disclosure and ruin if this were not promptly paid. It was signed “Charles Grant.” The senior Knapp could make no sense of the matter and asked his son for advice. It’s “a devilish lot of trash,” Joe Knapp Jr. told his father and advised him to give it to the committee. The Committee of Vigilance pounced on the letter. It sent $50 anonymously to Grant at his local post office, with a promise of more to come, and a man was dispatched to apprehend whoever collected the money. The recipient turned out to be John C.R. Palmer. Arrested as a possible accessory to the murder, but promised immunity for his testimony, he told a complex tale: during a stay at the Crowninshield family home, Palmer had overheard George tell Richard that John Francis (“Frank”) Knapp, a son of Joseph Knapp Sr., wanted them to kill Captain White—and that Joe Jr., Frank’s brother, would pay them $1,000 to commit the crime. The Committee of Vigilance promptly arrested the Knapp brothers and sent them to the Salem Gaol, their cells not far from those occupied by the Crowninshields. At first, Richard Crowninshield exuded a sense of rectitude, certain that he would be found innocent. During his imprisonment, he asked for books on mathematics and Cicero’s Orations, and conveyed nonchalance—until the end of May, when Joe Knapp confessed to his role in the murder plot. The confession was given to the Rev. Henry Colman, an intimate friend of the White family. Colman also had close connections to the Committee of Vigilance, and in this role had promised Joe immunity from prosecution in exchange for his testimony. The nine-page confession—in Colman’s handwriting but signed by Knapp—began, “I mentioned to my brother John Francis Knapp, in February last, that I would not begrudge one thousand dollars that the old gentleman, meaning Capt. Joseph White of Salem, was dead.” It went on to explain that Joe Knapp believed if Captain White died without a legal will, his fortune would be divided among his close relatives, giving Mary Beckford, Knapp’s mother-in-law, a considerable fortune. 
To this end, Joe opened Captain White’s iron chest four days before the murder and stole what he erroneously believed to be the old man’s legal will. The true last will of Joseph White, favoring his nephew Stephen, was safely in the office of the dead man’s lawyer. But Joe was unaware of this fact. He hid the document in a box he covered with hay and burned the stolen paper the day after the murder. Joe and Frank had debated how to commit the murder. They considered ambushing White on a road or attacking him in his house. Frank, however, told Joe that “he had not the pluck to do it,” and suggested hiring Richard and George Crowninshield, whom the Knapp brothers had known since adolescence. After several meetings, the Knapps and the Crowninshields gathered at the Salem Common at 8 p.m. on April 2 to finalize the plan. Richard, Joe confessed, had thoughtfully displayed the “tools” he planned to use for the project. Using his machinist’s skills, he had manufactured one of the murder weapons—a club—himself. It was “two feet long, turned of hard wood...and ornamented...with beads at the end to keep it from slipping....The dirk was about five inches long on the blade...sharp at both edges, and tapering to a point.” That same evening, after stealing what he believed to be the will, Joe Knapp “unbarred and unscrewed” a window in Captain White’s house. Four days later, at 10 p.m., Richard Crowninshield entered the front yard through the garden gate and climbed through the unlocked window to murder White. The detailed confession pointed to Richard Crowninshield as the principal perpetrator of the deed: he would surely hang. But Richard learned from defense attorney Franklin Dexter that Massachusetts law did not allow the trial of an accessory to a crime unless the principal had first been tried and convicted. Richard must have seen a way to exercise his ingenuity one last time and perhaps save his brother and friends. On June 15, at 2 in the afternoon, a jailer found Richard’s body hanging by its neck from two silk handkerchiefs tied to the bars of his cell window. The Commonwealth of Massachusetts, it seemed, had been cheated out of an open-and-shut case, unless the state could find a legal basis for putting the other three men on trial. Newspaper reporters descended on Salem from as far away as New York City—ostensibly with the lofty goal of ensuring that justice be achieved. In the words of pioneering journalist James Gordon Bennett, then a correspondent for the New York Courier: “The press is the living jury of the Nation!” The prosecution in the White case faced a quandary. Not only was there no prior conviction of the principal (due to Richard Crowninshield’s suicide), but Joe Knapp was refusing to testify and uphold his confession. So the prosecution turned to Senator Daniel Webster of Boston, the New Hampshire-born lawyer, lawmaker and future secretary of state, perhaps best remembered for his efforts to hammer out compromises between Northern and Southern states that he believed would forestall civil war. Webster, then 48, had served several terms in the House of Representatives before being elected to the U.S. Senate in 1827. He was a close friend of such Salem area notables as Stephen White and Supreme Court Justice Joseph Story. 
Webster’s commanding presence, his dramatic dark coloring and his relentless gaze had earned him the sobriquet “Black Dan.” In the courtroom he was known to be fierce at cross-examination and riveting at summation—“the immortal Daniel,” the New Hampshire Patriot and States Gazette had called him. Asked by Stephen White to aid prosecutors at the murder trial, Webster was torn. Throughout his lengthy legal career, he had always stood for the defense. A large part of his reputation rested on his passionate oratory on behalf of the accused. Further, his personal connections with the victim’s friends and relatives raised delicate issues of legal ethics. On the other hand, if he stood by his friends, the favor would someday be repaid. Then there was the handsome fee of $1,000 that Stephen White had discreetly arranged for his services. Webster, a heavy drinker who had a tendency to spend beyond his means and was chronically in debt, agreed to “assist” the prosecution—which meant, of course, that he would lead it. The accused men had chosen to be tried separately, and the first to come to trial, in August 1830, was Frank Knapp. Interest ran high. Bennett reported that the crowds trying to enter the courtroom to see Webster were “like the tide boiling up on the rocks.” With Richard Crowninshield dead—“There is no refuge from confession but suicide, and suicide is confession,” Webster famously said—Webster’s intent was to establish Frank Knapp as a principal rather than an accessory. Several witnesses testified they had seen a man wearing a “camlet cloak” and a “glazed cap,” such as Frank often wore, late on the night of the murder, on Brown Street, behind the White property. Webster argued that Frank was there to give direct aid to the murderer, and was therefore a prime actor. The defense challenged the witnesses’ identification and scoffed at the notion that Frank’s mere presence on Brown Street could have provided vital aid. The jury deliberated for 25 hours before announcing it was deadlocked. The judge declared a mistrial. The case was scheduled to be retried two days later. The second trial brought debate over the forensic evidence to the fore. In the first trial, only Dr. Johnson had testified. But this time the prosecution included the formal testimony of Dr. Peirson. His dissenting opinion on the autopsy—that there possibly had been two assailants—had been widely read in the Salem Gazette. Now Peirson was being used as an expert witness in an apparent attempt to cast doubt on the theory that Richard Crowninshield had acted alone in the lethal assault upon Joseph White. Webster speculated that Knapp might have delivered the “finishing stroke,” or that the other wounds had been inflicted “from mere wantonness.” Knapp’s defense attorney ridiculed the argument, wondering aloud why Knapp would return to the house to stab a dead body: “Like another Falstaff did he envy the perpetrator the glory of the deed and mean to claim it as his own?” In the interim between the two trials, the new jury had been exposed to newspaper accounts of the first hearing, as well as to heavy criticism leveled at the previous jury for its failure to convict. Thus encouraged, the second jury listened intently as Webster captivated the courtroom with a dramatic re-creation of the crime: “A healthful old man, to whom sleep was sweet, the first sound slumbers of the night held him in their soft but strong embrace. The assassin enters, through the window already prepared . . . 
With noiseless foot he paces the lonely hall, half-lighted by the moon; he winds up the ascent of the stairs, and reaches the door of the chamber. Of this, he moves the lock, by soft and continued pressure, till it turns on its hinges without noise; and he enters, and beholds his victim before him....” Webster’s summation was later deemed a masterpiece of oratory. “The terrible power of the speech and its main interest lie in the winding chain of evidence, link by link, coil by coil, round the murderer and his accomplices,” British literary critic John Nichol wrote. “One seems to hear the bones of the victim crack under the grasp of a boa-constrictor.” Samuel McCall, a prominent lawyer and statesman, called the speech “the greatest argument ever addressed to a jury.” After just five hours of deliberation the jury accepted Webster’s contention that Frank Knapp was a principal to the crime and convicted him of murder. “The town now begins to grow rather more quiet than it has been since the murder of Mr. White,” Nathaniel Hawthorne wrote in a letter to a cousin, “but I suppose the excitement will revive at the execution of Frank Knapp.” Hawthorne, a still-struggling, 26-year-old writer living in his mother’s home in Salem, was riveted by the case. The son and grandson of respected sea captains, he was also a descendant of John Hathorne, one of the infamous hanging judges of the witchcraft trials. The family connection both fascinated and repelled the future novelist, and no doubt informed his lifelong interest in crime and inherited guilt. At the time of the Knapp trial, Hawthorne was writing short fiction for local papers, including the Salem Gazette, which covered the story assiduously. Some scholars have suggested that Hawthorne wrote some of the newspaper’s unsigned articles about the murder, though there is no hard evidence to support that. In letters, Hawthorne described the town’s “universal prejudice” against the Knapp family and expressed his own ambivalence about the jury’s verdict: “For my part, I wish Joe to be punished, but I should not be very sorry if Frank were to escape.” On September 28, 1830, before a crowd of thousands, Frank Knapp was hanged in front of Salem Gaol. His brother Joseph, tried and convicted in November, met the same fate three months later. George Crowninshield, the remaining conspirator, had spent the night of the murder with two ladies of the evening, who provided him with an alibi. After two trials he was acquitted by a now-exhausted court. The two men who had been in the company of George in the gambling house were discharged without trial. By September 9, 1831, Hawthorne was writing to his cousin that, “The talk about Captain White’s murder has almost entirely ceased.” But echoes of the trial would reverberate in American literature. Two decades later, Hawthorne found inspiration in the White murder in writing The Scarlet Letter (1850). Margaret Moore—the former secretary of the Nathaniel Hawthorne Society and the author of The Salem World of Nathaniel Hawthorne—argues that Webster’s ruminations on the uncontrollable urge to confess influenced Hawthorne’s portrayal of the Rev. Arthur Dimmesdale in The Scarlet Letter. 
Dimmesdale is tortured by the secret of being the lover of Hester Prynne—and when Hester hears Dimmesdale’s final sermon, Hawthorne writes, she could detect “the complaint of the human heart, sorrow-laden, perchance guilty, telling its secret, whether of guilt or sorrow, to the great heart of mankind; beseeching its sympathy or forgiveness—at every moment—in each accent....” The late Harvard University literary scholar Francis Otto Matthiessen argued that echoes of the White murder and Webster’s summation also found their way into The House of the Seven Gables (1851). The opening chapter sets the gothic tone by describing the Pyncheon family’s sordid history—the murder 30 years prior of the family patriarch, “an old bachelor and possessed of great wealth in addition to the house and real estate.” Later in the novel, Hawthorne devotes 15 pages to an unnamed narrator who describes and taunts the corpse of the tyrannical Judge Pyncheon. Matthiessen saw Webster’s influence particularly in the way Hawthorne used the imagery of moonlight: “Observe that silvery dance upon the upper branches of the pear-tree, and now a little lower, and now on the whole mass of boughs, while, through their shifting intricacies, the moonbeams fall aslant into the room. They play over the Judge’s figure and show that he has not stirred throughout the hours of darkness. They follow the shadows, in changeful sport, across his unchanging features.” The White murder also left its mark on Edgar Allan Poe, who at the time of the crime was poised to enter the U.S. Military Academy at West Point (which he left after one year by deliberately getting court-martialed for disobedience). Nobody knows if Poe followed the trial as it occurred, but by 1843, when he published “The Tell-Tale Heart,” he had clearly read about it. Poe scholar T. O. Mabbott has written that Poe relied critically on Webster’s summation in writing the story. At the trial, Webster spoke of the murderer’s “self-possession” and “utmost coolness.” The perpetrator, he added, ultimately was driven to confession because he believed the “whole world” saw the crime in his face and the fatal secret “burst forth.” Likewise, Poe’s fictional murderer boasts of “how wisely” and “with what caution” he killed an old man in his bedchamber. But the perfect crime comes undone when Poe’s murderer—convinced that the investigating police officers know his secret and are mocking him—declares, “I felt that I must scream or die!...I admit the deed!” The spellbinding summation Daniel Webster delivered at the trial was printed as part of an anthology of speeches later that year and sold to an admiring public. But Black Dan’s political ambitions took a turn for the worse in 1850 when, belying his years of opposition to slavery, he gave an impassioned speech defending the new Fugitive Slave Act, which required Northern states to aid in the return of escaped slaves to their Southern masters. The legislation was part of a compromise that would allow California to be admitted to the Union as a “free state.” But abolitionists perceived the speech as a betrayal and believed it to be an attempt by Webster to curry favor with the South in his bid to become the Whig Party’s presidential candidate in 1852, and he lost the nomination. Webster died shortly thereafter from an injury resulting from a carriage accident. The autopsy revealed the cause of death to be a brain hemorrhage, complicated by cirrhosis of the liver. For its part, Salem would become an important center of antislavery activism. 
Prior to Frederick Douglass’ emergence as a national figure in the 1840s, Salem native Charles Lenox Remond was the most famous African-American abolitionist in the United States and Europe. His sister, Sarah Parker Remond, also lectured abroad, and often shared the podium with Susan B. Anthony at antislavery conventions. Salemites would make every effort to put the White murder behind them. Even a century after the trial, the town was reluctant to speak of it. Caroline Howard King, whose memoir When I Lived in Salem appeared in 1937, destroyed the chapter about the crime before publication, judging it to be “indiscreet.” In 1956, when Howard Bradley and James Winans published a book about Webster’s role in the trial, they initially encountered resistance when conducting their research. “Some people in Salem preferred to suppress all reference to the case,” Bradley and Winans wrote, and “there were still people who viewed inquiries about the murder with alarm.” Today, the Salem witch trials drive the town’s tourist trade. But, every October, you can go on historian Jim McAllister’s candlelight “Terror Trail” tour, which includes a stop at the scene of the crime, now known as the Gardner-Pingree House. You can also tour the inside of the house—a national historic landmark owned by the Peabody Essex Museum—which has been restored to its 1814 condition. The museum possesses—but doesn’t exhibit—the custom-made club that served as the murder weapon. I was allowed to inspect it, standing in a cavernous storage room wearing a pair of bright blue examination gloves. The club is gracefully designed and fits easily in the hand. I couldn’t help but admire Richard Crowninshield’s workmanship. Crime historian E.J. Wagner is the author of The Science of Sherlock Holmes. Chris Beatrice is a book and magazine illustrator who lives in Massachusetts.
5f842bfc5bb0503b87be03591f204c2c
https://www.smithsonianmag.com/history/a-northern-family-confronts-its-slaveholding-past-88307/
A Northern Family Confronts Its Slaveholding Past
A Northern Family Confronts Its Slaveholding Past When Katrina Browne discovered that her New England ancestors, the DeWolfs, were the largest slave-trading family in U.S. history, she invited DeWolf descendants to retrace the Triangle Trade route and confront this legacy. Traces of the Trade: A Story from the Deep North, which airs June 24 on the PBS film series P.O.V., follows their journey and documents the North's intimate relationship with slavery. Browne's cousin Thomas DeWolf has also written a book about the trip, Inheriting the Trade: A Northern Family Confronts Its Legacy as the Largest Slave-Trading Dynasty in U.S. History. This year is the bicentennial of the federal abolition of the slave trade.
How did you first find out about your family's history and why did you want to make a film about it?
I was in seminary in my late 20s—I was 28 years old—and I got a booklet that my grandmother sent to all her grandchildren. She was 88 and coming to the end of her life and wondering if her grandkids actually knew anything about their family history—whether they cared. She was conscientious enough to put in a couple sentences about the fact that our ancestors were slave traders. It hit me incredibly hard when I read those sentences. I probably would have just treated the whole thing as my problem to reckon with on my own with my family, privately, if I hadn't come across a book by historian Joanne Pope Melish called Disowning Slavery. She traced the process whereby the northern states conveniently forgot that slavery was a huge part of the economy. Slavery itself existed in New England for over 200 years. History books leave most of us with the impression that because it was abolished in the North before the South, it was as if it never happened in the North, that we were the good guys and abolitionists and that slavery was really a Southern sin. That book made me realize what I had done with my own amnesia, and my family's amnesia was really parallel to this much larger regional dynamic. That's what inspired me to make this film—that showing me and my family grappling with it would give other white Americans an opportunity to think and talk about their own intimate feelings, wherever their family history may lie, and that it would also set Americans straight about the history.
What did you find out about how and why the DeWolfs first got into the trade?
They were sailors and worked their way up to being slave ship captains. People typically would buy shares in slave ships and become part owners, and if you were successful you became a full owner. It was really [James DeWolf] who became extremely successful. He had a number of sons who all were in the slave trade. That's how it really became a dynasty—three generations in 50 years.
How did they use the Triangle Route, from Rhode Island to Ghana to Cuba and back?
In the late 18th century rum became a commodity that was in demand—it rose to the top as a commodity of interest on the West African coast as part of the slave trade. So more and more rum distilleries were built in Rhode Island and Massachusetts. The DeWolfs had a rum distillery—they would take rum to West Africa, they would trade it for people and then bring those captured Africans to, most frequently, Cuba and Charleston, South Carolina, but also to other Caribbean ports and other Southern states. In Cuba, they also owned sugar and coffee plantations. The molasses from the sugar plantations was a key ingredient for the rum-making. 
They had an auction house in Charleston, and they developed their own insurance company and bank.
Your family wasn't the only Northern family involved in this trade. How widespread was the practice and how did it impact the North's economy?
It would probably come as a surprise to most people that Rhode Island, despite being the smallest state in the country, was actually the largest slave-trading state in terms of the number of Africans brought on ships leaving from Rhode Island ports. The ships were often built by Massachusetts ship builders. The rope, the sails, the shackles, the other commodities were traded in addition to rum. Connecticut had a lot of farms, and a large portion of the commodities cultivated for trade were sent to [the West Indies]. The islands were typically turned into one-crop islands, where you turned all the land into sugar, tobacco, coffee—these commodities that were in demand. They weren't growing as much food [on the islands], so the food would be brought from Connecticut.
People may be surprised to learn that your family and others continued the trade well past when it was made illegal, in 1808. How were they able to do that?
Prior to 1808, various states passed laws outlawing the slave trade, but they weren't enforced practically at all. The DeWolfs and pretty much everyone else traded up until it was federally abolished in 1808. Thomas Jefferson was president at the time and he proposed they should close the trade. After 1808 a lot of people did quit the trade, including James DeWolf, but his nephew decided to ignore even that law, and he continued to trade until about 1820—at that point it became a capital offense, where you could be executed. It's interesting to think about how possible it was to be doing something that was not only completely immoral, but also illegal, and get away with it. With their Cuban slave-trading buddies they would sell one of their ships to one of their buddies for a dollar, and then it would be going around the triangle with the Cuban flag on it, and then they'd buy it back.
How did the DeWolfs' wealth and privilege manifest itself in the Bristol community?
The DeWolfs were under the jurisdiction of Newport, and the Newport customs collector believed in enforcing the state law. They wanted to get around the law so they lobbied Congress to create a separate customs district, and they succeeded. Then they recommended their brother-in-law, Charles Collins, to be appointed collector of ports, and that's who Thomas Jefferson appointed. Collins was part owner of one of their Cuban plantations. People including the Newport collector protested the appointment. It was brought to Jefferson and his Secretary of the Treasury's attention, and they didn't do anything about it. The DeWolfs were major campaign contributors to Thomas Jefferson. One can only assume that he wasn't going to cause trouble for them.
When you and your nine relatives arrived in Ghana and then in Cuba, what remnants of the trade did you see?
In Ghana we visited the slave forts—there were dozens of them up and down the coast and some of them have been turned into historic sites protected by UNESCO. It's very intense to go to the dungeons where people were held and where you know your ancestors had been. I'd brought so much defensiveness to the conversation before, some of which has to do with my ancestors and some of which has to do with being white in America. 
Something happened for me, being there, where I could just pull away that defensiveness and the very natural reaction became pure empathy—imagining what it would be like to be a descendant of people who had been brutalized in that way.
When you visited Ghana it was during Panafest, which is attended by many African Americans. What is that event, and what was it like to be in the midst of it?
We were totally nervous and always walking on eggshells. It's a time of pilgrimage for people of African descent who, for many, are the first ones to be coming back to West Africa since their ancestors were taken away. The reactions that we encountered were completely across the board—from people who really appreciated our being there and our desire to face the history to people who really resented us being there and felt we were invading their space. It was such a sacred moment for them that the last people they wanted to see were white Americans, let alone descendants of slave traders.
How did your family members' attitudes toward their slave-trading history—or towards contemporary race issues—change as the trip progressed?
A lot of us were really inspired to get involved in public policy debates—the reparations debate and how to think about repair. I think everyone [on the trip] would say we have a sense of responsibility because we know that we had a leg up, and therefore we think there's a responsibility to use those privileges to make a difference. Most of us would say we don't feel personally guilty.
e1c06f198c1b68f9e79b80d8be5ff4b6
https://www.smithsonianmag.com/history/a-symbol-that-failed-149514383/
A Symbol That Failed
A Symbol That Failed It's big for a brooch, about six inches across and maybe two and a half high. But because it's partly transparent, and cleverly hinged to fit the curves of a lady's body, it does not seem clunky. Tiny diamonds etch its design—olive branches with leaves—and, perched symmetrically upon them, eight doves. Altogether, it's fairly typical of the work of the famed French jewelry and glass designer René Lalique. Ordinarily you find this brooch at the Smithsonian's National Museum of American History. But lately it has been at the Cooper-Hewitt, National Design Museum, in New York, in an exhibition of Lalique's fin de siècle jewelry, glass designs and drawings. Lalique created the olive branch brooch around 1906, and back then the birds were described as pigeons. Its story grows dim for a decade, in fact until December 1918, a month after the original Armistice Day ending the Great War. The "pigeons" suddenly turned into doves of peace, the olive branches took on full significance and the design became the perfect symbolic gift from the citizens of Paris to Edith Bolling Wilson, wife of President Woodrow Wilson. Catastrophic World War I was declared a "war to end wars." America had intervened to end the fighting. Now here Wilson was in Europe, conducting talks everyone hoped would prevent the outbreak of another. The impact of the President's visit is hard to imagine today. World War I, the Great War of 1914-1918, has faded into history, and Wilson's high-minded hopes for permanent peace now seem simpleminded or ironic. In the France of December 1918, the savage destruction of four years still lay heavy on the land and in the hearts of Europeans. Villages lay in rubble, forests had been hewed by storms of shrapnel, hills leveled and meadows cratered by high explosives. Politicians hadn't been able to stop the insanity. Generals could only suggest sending more troops slithering through the mud to die in further attacks. It was only when, at last, Woodrow Wilson poured in American troops and later offered his idealistic Fourteen Points for restructuring Europe that Germany, starving and rebellious, surrendered. Four weeks after the armistice, the Wilsons arrived in Brest aboard an impounded prewar German liner named George Washington (to make American tourists feel comfortable). France went wild. Guns roared in salute; bands crashed out "The Star-Spangled Banner" and "La Marseillaise"; great crowds jammed the streets of Paris, shouting "Vive Veelson!" The lanky, professorial pince-nez-wearing President was all but sainted as a war-ender and forger of a new and better world. And everyone loved pretty, dark-haired Edith Wilson, his new wife, tall, and what is best described as "shapely," with a ready smile and easy charm. Edith Bolling Galt Wilson—undeniably "F.F.V." (First Families of Virginia) with roots going back to John Rolfe and the Indian princess Pocahontas—met and married the President in 1915. He was a lonely widower and she a widow. She gave him the companionship and loving support that he craved. Their voyage to France at the end of the war would have seemed like a second honeymoon except for the pressure Wilson felt to negotiate a just peace with all those tricky and vengeful European leaders. On December 16 at the Hotel de Ville (City Hall), the President was given a gold medal. Edith Wilson noted in her journal that "... 
to my surprise, the master of ceremonies then turned to me and presented a beautiful Lalique box containing a most unusual pin composed of six doves of peace...." Well, she counted wrong, but the entry indicates gratitude for this "pretty conceit." She went to Paris again in 1919, for the signing of the peace treaty. "I wore an unusual gown by Worth," she tells us, and "the great pin with the diamonds and doves of peace...." The Lalique brooch shows up in a portrait of her, done in 1920 by Seymour Stone. A dispute arose and the portrait never hung at the White House. Perhaps it also reminded her of a tragic time in her life and the life of the President. Wilson had collapsed during his 1919 "peace" tour of the nation, a demanding trip around the United States, undertaken, despite ill-health, to rouse public support for the peace treaty that he'd played such a large role in constructing, and especially for the League of Nations. The treaty, based on his Fourteen Points, required a League of Nations, and the United States, he believed, must surely join the League to add to its peacekeeping credibility. But he found that the League, which sounded so sensible in war-torn Europe, rubbed a lot of Americans the wrong way, especially Massachusetts Senator Henry Cabot Lodge. Everyone wanted the treaty, all right, but ties with Europe made people leery about one day having to pull European chestnuts out of the fire—again. The President's splendid oratory won him admiration and affection, as he toured the country, but did not produce the flood of pro-League telegrams to Washington that he sought. Political opponents claimed he had forgotten the workings of a democracy. He didn't ask for support, he demanded it in the name of national virtue. Even his French colleague, Georges Clemenceau, found himself bored with Wilson's Fourteen Points: "Why," he exclaimed, "God Almighty has only ten!" And H. L. Mencken, ever watching for an open shot at big game, declared that Wilson was waiting for "the first vacancy in the Trinity." The President drove himself cruelly in a losing cause. In Kansas, he collapsed and was rushed home. He seemed to get a little better, then took a fall and suffered a stroke. Thereafter he was unable to handle the work of the Presidency, and the normal running of the country slowed to a crawl. With only two years of formal schooling but devoted to keeping tedious chores away from her husband, Edith Wilson checked every letter, every request for a decision, even every bill to sign. It was claimed she signed some of them for her husband, but most she shelved without an answer. Newly arrived ambassadors weren't received, candidates for empty Cabinet posts were left twisting in the wind. Vice President Thomas Marshall, famous mostly for remarking that "what this country needs is a really good five-cent cigar," slumped into a fit of depression when someone suggested that he might have to take over the reins. "Presidentess" Edith had firm hold of them. Rumors flew that the President was mad—and indeed the meager communications from the White House often made little sense. Letters to the President from members of the Cabinet would be answered in "a large, school-girlish handwriting" that meandered all over the page. Mistrust of the highest office—almost unheard of in those innocent days—appeared and grew, and anger focused on the only people who had access to the President: his doctor, Cary Grayson, his long-trusted secretary, Joseph P. Tumulty, and finally the second Mrs. Wilson. 
The Baltimore Sun wrote of congressional suspicions that the idleness of the White House must be blamed on "the dark and mysterious Mr. Tumulty, or, more sinister still, must we look for the woman in the case?" President Wilson never recovered. The Senate rejected the treaty, and with it U.S. entry into the League of Nations. As the election of 1920 approached, the Democratic candidates for President and Vice President, James Cox of Ohio and a chap named Franklin Delano Roosevelt, called on the sick old man. Joining the League would be part of his platform, Cox promised. That finished him. The Republicans' "available man," Warren Harding, won handily; the League was forgotten for good. So, it seemed, was Wilson. Beaten and shattered, he clung to life till a bleak February morning in 1924. Then the country suddenly remembered, and crowds knelt in the street outside the house in Washington. Edith Wilson lived on, dedicating herself to fiercely safeguarding the memory of her husband. No one knows what the League of Nations might have done if the United States had joined, but without us it proved spectacularly fruitless in maintaining peace. After World War II mankind created its strange stepson, the United Nations. Edith Wilson lived to see it all. In 1961, as a "little old lady" in her late 80s—and just a few months before her death—she sat beside President John F. Kennedy as he signed a bill authorizing a memorial to Woodrow Wilson. He gave her the pen. She took it gratefully. "I didn't dare ask for it," she smiled. They both knew that was a fib.
315cb6ecf9abf9137a563e7646d2d3f0
https://www.smithsonianmag.com/history/abraham-lincoln-only-president-have-patent-131184751/
Abraham Lincoln Is the Only President Ever to Have a Patent
Abraham Lincoln Is the Only President Ever to Have a Patent Upon hearing the name Abraham Lincoln, many images may come to mind: rail-splitter, country lawyer, young congressman, embattled president, Great Emancipator, assassin's victim, even the colossal face carved into Mount Rushmore. One aspect of this multidimensional man that probably doesn't occur to anyone other than avid readers of Lincoln biographies (and Smithsonian) is that of inventor. Yet before he became the 16th president of the United States, Lincoln, who had a long fascination with how things worked, invented a flotation system for lifting riverboats stuck on sandbars. Though his invention was never manufactured, it serves to give Lincoln yet another honor: he remains the only U.S. president to have a patent in his name. According to Paul Johnston, curator of maritime history at the National Museum of American History (NMAH), Lincoln's eminence and the historical rarity of his patent make the wooden model he submitted to the Patent Office "one of the half dozen or so most valuable things in our collection." Lincoln's patent, No. 6,469, was granted on May 22, 1849, for a device for "Buoying Vessels Over Shoals," when he was back in Springfield practicing law after one term as an Illinois congressman in Washington. His idea, to equip boats with inflatable bellows of "india-rubber cloth, or other suitable water-proof fabric" levered alongside the hull, came as a result of river and lake expeditions he made as a young man, ferrying people and produce on the Mississippi and the Great Lakes. At least twice his boats ran aground on sandbars or hung up on other obstacles; given the Big River's ever-shifting shallows, such potentially dangerous misadventures happened often. Freeing a beached vessel usually involved the laborious unloading of cargo until the boat rode high enough to clear the snag. According to Harry R. Rubenstein, chair of the Division of Politics and Reform at NMAH, Lincoln "was keenly interested in water transportation and canal building, and enthusiastically promoted both when he served in the Illinois legislature." He was also an admirer of patent law, famously declaring that it "added the fuel of interest to the fire of genius." Lincoln appears to have had more than a passing affinity for mechanical devices and tools. William H. Herndon, his law partner at the time he was working on his invention, wrote that Lincoln "evinced a decided bent toward machinery or mechanical appliances, a trait he doubtless inherited from his father who was himself something of a mechanic...." The precise source of the model of the flotation device is unclear, though there's no doubt that the ingenuity behind it was Lincoln's. Herndon wrote about Lincoln bringing the wooden boat model into the law office, "and while whittling on it would descant on its merits and the revolution it was destined to work in steamboat navigation." A Springfield mechanic, Walter Davis, was said to have helped with the model, which was just over two feet long. But Johnston thinks it's possible that the detailed miniature Lincoln submitted may have been made by a model maker in Washington who specialized in aiding inventors. "The name engraved on top of the piece is 'Abram Lincoln,'" Johnston says. "It doesn't seem likely that if Lincoln had actually made this model, he'd have misspelled his own first name." 
Johnston says that the answer—yet undetermined—may lie in whether the misspelled name is also engraved under the original varnish, indicating the model to be a commission. The patent application for the device has a similar mystery. Part of the U.S. Patent Office collection, the document describes in detail how "by turning the main shaft or shafts in one direction, the buoyant chambers will be forced downwards into the water and at the same time expanded and filled with air." But it is missing the inventor's signature. Someone, probably in the early 20th century, cut Abe's signature out of the document—the autograph collector as vandal. Since no one ever tried to put the invention to use, we can't know for sure if it would have led to the revolution in steamboat navigation that Lincoln predicted. But "it likely would not have been practical," says Johnston, "because you need a lot of force to get the buoyant chambers even two feet down into the water. My gut feeling is that it might have been made to work, but Lincoln's considerable talents lay elsewhere." Owen Edwards is a freelance writer who previously wrote the "Object at Hand" column in Smithsonian magazine.
3c5a9edcec4d37ce14396610fb736c76
https://www.smithsonianmag.com/history/alexander-hamiltons-adultery-and-apology-18021947/
Alexander Hamilton’s Adultery and Apology
Alexander Hamilton’s Adultery and Apology In the summer of 1791, Alexander Hamilton received a visitor. Maria Reynolds, a 23-year-old blonde, came to Hamilton’s Philadelphia residence to ask for help. Her husband, James Reynolds, had abandoned her—not that it was a significant loss, for Reynolds had grossly mistreated her before absconding. Hamilton, just 34, was serving as secretary of the United States treasury and was himself a New Yorker; she thought he would surely be able to help her return to that city, where she could resettle among friends and relatives. Hamilton was eager to be of service, but, he recounted later, it was not possible at the moment of her visit, so he arranged to visit her that evening, money in hand. When he arrived at the Reynolds home, Maria led him into an upstairs bedroom. A conversation followed, after which Hamilton felt certain that “other than pecuniary consolation would be acceptable” to Maria Reynolds. And thus began an affair that would put Alexander Hamilton at the front of a long line of American politicians forced to apologize publicly for their private behavior. Hamilton (whose wife and children were vacationing with relatives in Albany) and Maria Reynolds saw each other regularly throughout the summer and fall of 1791—until James Reynolds returned to the scene and instantly saw the profit potential in the situation. On December 15, Hamilton received an urgent note from his mistress: I have not tim to tell you the cause of my present troubles only that Mr. has rote you this morning and I know not wether you have got the letter or not and he has swore that If you do not answer It or If he dose not se or hear from you to day he will write Mrs. Hamilton he has just Gone oute and I am a Lone I think you had better come here one moment that you May know the Cause then you will the better know how to act Oh my God I feel more for you than myself and wish I had never been born to give you so mutch unhappiness do not rite to him no not a Line but come here soon do not send or leave any thing in his power. Two days later, Hamilton received a letter from James Reynolds that accused him of destroying a happy home and proposed a solution: Its true its in your power to do a great deal for me, but its out of your power to do any thing that will Restore to me my Happiness again for if you should give me all you possess would not do it. god knowes I love the woman and wish every blessing may attend her, you have bin the Cause of Winning her love, and I Dont think I Can be Reconciled to live with Her, when I know I hant her love. now Sir I have Considered on the matter Serously. I have this preposial to make to you. give me the Sum Of thousand dollars and I will leve the town and take my daughter with me and go where my Friend Shant here from me and leve her to Yourself to do for her as you thing proper. I hope you wont think my request is in a view of making Me Satisfaction for the injury done me. for there is nothing that you Can do will compensate for it. Rather than leave town (and his new mark), James Reynolds allowed the relationship to continue. 
A pattern was established in which Maria Reynolds (by this time likely complicit in her husband’s scheme) would write to Hamilton, entreating him to visit when her husband was out of the house: I have kept my bed those tow days past but find my self mutch better at presant though yet full distreesed and shall till I se you fretting was the Cause of my Illness I thought you had been told to stay away from our house and yesterday with tears I my Eyes I beged Mr. once more to permit your visits and he told upon his honnour that he had not said anything to you and that It was your own fault believe me I scarce knew how to beleeve my senses and if my seturation was insupportable before I heard this It was now more so fear prevents my saing more only that I shal be miserable till I se you and if my dear freend has the Least Esteeme for the unhappy Maria whos greateest fault Is Loveing him he will come as soon as he shall get this and till that time My breast will be the seate of pain and woe P. S. If you cannot come this Evening to stay just come only for one moment as I shal be Lone Mr. is going to sup with a friend from New York. After such trysts occurred, James Reynolds would dispatch a request for funds—rather than demand sums comparable to his initial request of $1,000 (which Hamilton paid), he would request $30 or $40, never explicitly mentioning Hamilton’s relationship with Maria but referring often to Hamilton’s promise to be a friend to him. James Reynolds, who had become increasingly involved in a dubious plan to purchase on the cheap the pension and back-pay claims of Revolutionary War soldiers, found himself on the wrong side of the law in November 1792, and was imprisoned for committing forgery. Naturally, he called upon his old friend Hamilton, but the latter refused to help. Reynolds, enraged, got word to Hamilton’s Republican rivals that he had information of a sort that could bring down the Federalist hero. James Monroe, accompanied by fellow Congressmen Frederick Muhlenberg and Abraham Venable, visited Reynolds in jail and his wife at their home and heard the tale of Alexander Hamilton, seducer and homewrecker, a cad who had practically ordered Reynolds to share his wife’s favors. What’s more, Reynolds claimed, the speculation scheme in which he’d been implicated also involved the treasury secretary. (Omitted were Reynolds’ regular requests for money from Hamilton.) Political enemy he might have been, but Hamilton was still a respected government official, and so Monroe and Muhlenberg, in December 1792, approached him with the Reynoldses’ story, bearing letters Maria Reynolds claimed he had sent her. Aware of what being implicated in a nefarious financial plot could do to his career (and the fledgling nation’s economy), Hamilton admitted that he’d had an affair with Maria Reynolds, and that he’d been a fool to allow it (and the extortion) to continue. Satisfied that Hamilton was innocent of any wrongdoing beyond adultery, Monroe and Muhlenberg agreed to keep what they’d learned private. And that, Hamilton thought, was that. James Monroe had a secret of his own, though. While he kept Hamilton’s affair from the public, he did make a copy of the letters Maria Reynolds had given him and sent them to Thomas Jefferson, Hamilton’s chief adversary and a man whose own sexual conduct was hardly above reproach. The Republican clerk of the House of Representatives, John Beckley, may also have surreptitiously copied them. 
In a 1796 essay, Hamilton (who had ceded his secretaryship of the treasury to Oliver Wolcott in 1795 and was acting as an adviser to Federalist politicians) impugned Jefferson’s private life, writing that the Virginian’s “simplicity and humility afford but a flimsy veil to the internal evidences of aristocratic splendor, sensuality, and epicureanism.” He would get his comeuppance in June 1797, when James Callender’s The History of the United States for 1796 was published. Callender, a Republican and a proto-muckraker, had become privy to the contents of Hamilton’s letters to Reynolds (Hamilton would blame Monroe and Jefferson, though the more likely source was Beckley, who had by then left his clerk’s position). Callender’s pamphlet alleged that Hamilton had been guilty of involvement in the speculation scheme and was more licentious than any moral person could imagine. “In the secretary’s bucket of chastity,” Callender asserted, “a drop more or less was not to be perceived.” Callender’s accusations and his access to materials related to the affair left Hamilton in a tight spot—to deny all the charges would be an easily proven falsehood. The affair with Maria Reynolds could destroy his marriage, not to mention his hard-won social standing (he had married Elizabeth Schuyler, daughter of one of New York’s most prominent families, a match many thought advantageous to Hamilton). But to be implicated in a financial scandal was, to Hamilton, simply unthinkable. As Secretary of the Treasury, he’d been the architect of early American fiscal policy. To be branded as corrupt would not only end his career, but also threaten the future of the Federalist Party. Left with few other options, Hamilton decided to confess to his indiscretions with Maria Reynolds and use that confession as proof that on all other fronts, he had nothing to hide. But his admission of guilt would be far more revealing than anyone could have guessed. Hamilton’s pamphlet Observations on Certain Documents had a simple purpose: in telling his side of the story and offering letters from James and Maria Reynolds for public review, he would argue that he had been the victim of an elaborate scam, and that his only real crime had been an “irregular and indelicate amour.” To do this, Hamilton started from the beginning, recounting his original meeting with Maria Reynolds and the trysts that followed. The pamphlet included revelations sure to humiliate Elizabeth Hamilton—that he and Maria had brought their affair into the Hamilton family home, and that Hamilton had encouraged his wife to remain in Albany so that he could see Maria without explanation. Letters from Maria to Hamilton were breathless and full of errors (“I once take up the pen to solicit The favor of seing again oh Col hamilton what have I done that you should thus Neglect me”). How would Elizabeth Hamilton react to being betrayed by her husband with such a woman? Still, Hamilton pressed on in his pamphlet, presenting a series of letters from both Reynoldses that made Hamilton, renowned for his cleverness, seem positively simple. On May 2, 1792, James Reynolds forbade Hamilton from seeing Maria ever again; on June 2, Maria wrote to beg Hamilton to return to her; a week after that, James Reynolds asked to borrow $300, more than double the amount he usually asked for. (Hamilton obliged.) Hamilton, for his part, threw himself at the mercy of the reading public: This confession is not made without a blush. 
I cannot be the apologist of any vice because the ardor of passion may have made it mine. I can never cease to condemn myself for the pang which it may inflict in a bosom eminently entitled to all my gratitude, fidelity, and love. But that bosom will approve, that, even at so great an expense, I should effectually wipe away a more serious stain from a name which it cherishes with no less elevation than tenderness. The public, too, will, I trust, excuse the confession. The necessity of it to my defence against a more heinous charge could alone have extorted from me so painful an indecorum. While the airing of his dirty laundry was surely humiliating to Hamilton (and his wife, whom the Aurora, a Republican newspaper, asserted must have been just as wicked to have such a husband), it worked—the blackmail letters from Reynolds dispelled any suggestion of Hamilton’s involvement in the speculation scheme. Still, Hamilton’s reputation was in tatters. Talk of further political office effectively ceased. He blamed Monroe, whom he halfheartedly tried to bait into challenging him to a duel. (Monroe refused.) This grudge would be carried by Elizabeth Hamilton, who, upon meeting Monroe before his death in 1831, treated him coolly on her late husband’s behalf. She had, by all accounts, forgiven her husband, and would spend the next fifty years trying to undo the damage of Hamilton’s last decade of life. Hamilton’s fate, of course, is well-known, though in a way the Reynolds affair followed him to his last day. Some time before the publication of his pamphlet, Hamilton’s former mistress Maria Reynolds sued her husband for divorce. The attorney who guided her through that process was Aaron Burr. Sources: Chernow, Ron. Alexander Hamilton, Penguin Books, 2005; Hamilton, Alexander. Observations on Certain Documents, 1797; Callender, James. The History of the United States for 1796, 1797; Brodie, Fawn McKay. Thomas Jefferson: An Intimate History, W.W. Norton & Co., 1975; Collins, Paul. Duel With the Devil: The True Story of How Alexander Hamilton and Aaron Burr Teamed Up to Take on America’s First Sensational Murder Mystery, Crown, 2013; McCraw, Thomas K. The Founders and Finance: How Hamilton, Gallatin, and Other Immigrants Forged a New Economy, Belknap Press, 2012; Rosenfeld, Richard M. American Aurora: A Democratic-Republican Returns, St. Martin’s Griffin, 1998. Angela Serratore is a writer and a contributing editor at Smithsonian.com
bce72f99e9ed97335e04077f2367c8c7
https://www.smithsonianmag.com/history/alfred-w-crosby-on-the-columbian-exchange-98116477/
Alfred W. Crosby on the Columbian Exchange
Alfred W. Crosby on the Columbian Exchange In 1972, Alfred W. Crosby wrote a book called The Columbian Exchange. In it, the historian tells the story of Columbus’s landing in 1492 through the ecological ramifications it had on the New World. At the time of publication, Crosby’s approach to history, through biology, was novel. “For historians Crosby framed a new subject,” wrote J.R. McNeill, a professor at Georgetown University, in a foreword to the book’s 30th anniversary edition. Today, The Columbian Exchange is considered a founding text in the field of environmental history. I recently spoke with the retired professor about the “Columbian Exchange”—a term that has worked its way into historians’ vernacular—and the impacts of some of the living organisms that transferred between continents, beginning in the 15th century.
You coined the term “Columbian Exchange.” Can you define it?
In 1491, the world was in many of its aspects and characteristics a minimum of two worlds—the New World, of the Americas, and the Old World, consisting of Eurasia and Africa. Columbus brought them together, and almost immediately and continually ever since, we have had an exchange of native plants, animals and diseases moving back and forth across the oceans between the two worlds. A great deal of the economic, social, political history of the world is involved in the exchange of living organisms between the two worlds.
When you wrote The Columbian Exchange, this was a new idea—telling history from an ecological perspective. Why hadn’t this approach been taken before?
Sometimes the more obvious a thing is the more difficult it is to see it. I am 80 years old, and for the first 40 or 50 years of my life, the Columbian Exchange simply didn’t figure into history courses even at the finest universities. We were thinking politically and ideologically, but very rarely were historians thinking ecologically, biologically.
What made you want to write the book?
I was a young American historian teaching undergraduates. I tell you, after about ten years of muttering about Thomas Jefferson and George Washington, you really need some invigoration from other sources. Then, I fell upon it, starting with smallpox. Smallpox was enormously important until quite modern times, until the middle of the 20th century at the latest. So I was chasing it down, and I found myself reading the original accounts of the European settlements in Mexico, Peru or Cuba in the 16th, 17th and 18th centuries. I kept coming across smallpox just blowing people away. So I thought there must be something else going on here, and there was—and I suppose still is.
How did you go about your research?
It was really quite easy. You just have to be prepared somehow or other to notice the obvious. You don’t have to read the original accounts in Spanish or Portuguese. There are excellent English translations dating back for generations. Practically all of them will get into a page or two or ten about the decimation of American Indians, or a page about how important maize is when all European crops fail, and things like that. I really didn’t realize that I was starting a revolution in historiography when I got into this subject.
So, how were the idea and the book received at first?
That is kind of interesting. I had a great deal of trouble getting it published. Now, the ideas are not particularly startling anymore, but they were at the time. Publisher after publisher read it, and it didn’t make a significant impression. 
Finally, I said, “the hell with this.” I gave it up. And a little publisher in New England wrote me and asked me if I would let them have a try at it, which I did. It came out in 1972, and it has been in print ever since. It has really caused a stir.
What crops do you consider part of the Columbian Exchange?
There was very little sharing of the main characters in our two New World and Old World systems of agriculture. So practically any crop you name was exclusive to one side of the ocean and carried across. I am thinking about the enormous ones that support whole civilizations. Rice is, of course, Old World. Wheat is Old World. Maize, or corn, is New World. The story of wheat is the story of Old World civilization. Thousands of years ago, it was first cultivated in the Middle East, and it has been a staple for humanity ever since. It is one of Europe’s greatest gifts to the Americas. Maize was the most important grain of the American Indians in 1491, and it is one of the most important grain sources in the world right now. It is a standard crop of people not only throughout the Americas, but also southern Europe. It is a staple for the Chinese. It is a staple in Indonesia, throughout large areas of Africa. If suddenly American Indian crops would not grow in all of the world, it would be an ecological tragedy. It would be the slaughter of a very large portion of the human race. Maize, potatoes and other crops are important not only because they are nourishing, but because they have different requirements of soil and weather and prosper in conditions that are different from other plants.
What ideas about domesticating animals traveled across the ocean?
American Indians were very, very roughly speaking the equal of Old World farmers of crops. But American Indians were inferior to the Old World raisers of animals. The horse, cattle, sheep and goat are all of Old World origin. The only American domesticated animals of any kind were the alpaca and the llama. One of the early advantages of the Spanish over the Mexican Aztecs, for instance, was that the Spanish had the horse. It took the American Indians a little while to adopt the horse and become equals on the field of battle.
You talk about the horse being an advantage in war. What other impacts did the adoption of domesticated horses have on the Americas?
Horses not only helped in war but in peace. The invaders had more pulling power—not only horses but also oxen and donkeys. When you consider the great buildings of the Old World, starting with the Egyptians and running up through the ages, people in almost all cases had access to thousands of very strong animals to help them. If you needed to move a ton of whatever in the Old World, you got yourself an animal to help you. When you turn to the Americas and look at temples, you realize people built these. If you need to move a ton in the New World, you just got a bunch of friends and told everybody to pull at the same time.
What diseases are included in the Columbian Exchange?
The Old World invaders came in with a raft of infectious diseases. Not that the New World didn’t have any at all, but it did not have the numbers that were brought in from the Old World. Smallpox was a standard infection in Europe and most of the Old World in 1491. It took hold in areas of the New World in the early part of the next century and killed a lot of American Indians, starting with the Aztecs and the people of Mexico and Peru. 
One wonders how a few hundred Spaniards managed to conquer these giant Indian empires. You go back and read the records and you discover that the army and, just generally speaking, the people of the Indian empires were just decimated by such diseases as smallpox, malaria, all kinds of infectious diseases. Megan Gambino is an editor and writer for Smithsonian.com and founded “Document Deep Dive.” Previously, she worked for Outside magazine in New Mexico.
cf833174ede25ebfc289c9cf03287628
https://www.smithsonianmag.com/history/alice-ramseys-historic-cross-country-drive-29114570/
Alice Ramsey’s Historic Cross-Country Drive
Alice Ramsey’s Historic Cross-Country Drive On June 9, 1909, in a rain-drenched New York City, a crowd of wet photographers gathered at 1930 Broadway to snap pictures of an “automobile” and the four poncho-cloaked women within. The car itself was a dark-green, four-cylinder, 30-horsepower 1909 Maxwell DA, a touring car with two bench seats and a removable Pantasote roof. But the cameras focused particular attention on the woman in the driver’s seat, 22-year-old Alice Ramsey. Just over five feet tall, with dark hair below her rubber helmet and visor, she posed until she could stand it no more; then she kissed her husband goodbye and cranked the motor to start the car. Off the Maxwell drove with a clank of tire chains, westward on a transcontinental crusade: the first all-female, cross-country road trip. Ramsey hadn’t set out to make feminist history—ironically, two men laid the groundwork for her trip. Her husband set the wheels in motion the previous year, after a “monster” scared Ramsey’s horse when it sped past at 30 miles per hour; John Rathbone Ramsey thought it wise to purchase his wife a car as well. Ramsey took to driving, and that summer she clocked 6,000 miles traveling the mostly dirt “highways” near her Hackensack, New Jersey, home. When she entered an endurance drive, a 200-mile trip to and from Montauk, a man representing automaker Maxwell-Briscoe Company marveled at her driving prowess and came up with an idea. He proposed an all-expenses-paid trip, courtesy of the company, if Ramsey showed the world that a Maxwell could take anyone—even a woman driver—all the way across America. To accompany her on the trip, Ramsey brought Nettie Powell and Margaret Atwood, her “conservative” sisters-in-law, both in their 40s; and Hermine Jahns, an enthusiastic 16-year-old friend. Ramsey and her three passengers had to learn the basics of car safety, wear hats and goggles, and cover their long dresses with dusters to protect themselves from dirt and dust. They spent nights at hotels and ate restaurant food and much-appreciated home-cooked meals, when possible; at other times, they picnicked on bread or, during one early morning stop in Utah, a breakfast of coffee, corn flakes, and canned tomatoes scrounged from a general store. Soon the Maxwell reached Ohio; driving the Cleveland Highway they set a personal best, attaining “the terrific speed of 42 miles per hour.” Though the Maxwell-Briscoe Company would publish an ad upon arrival stating that the group traveled “without a particle of car trouble,” this was far from the truth. Already, Ramsey had fixed at least one tire blowout and had called for a mechanic to repair a coil in Syracuse, waiting near their car while, as Ramsey would recall, someone in the crowd cried “Get a horse!” In the Midwest, the car ran out of gas. The women had forgotten to check the tank, a process that required the driver and her seatmate to leave the car, remove the front seat cushion, and stick a ruler into the Maxwell’s specially fitted 20-gallon fuel tank. The next day, moving through mud in low gear overworked the car, and the transmission needed water. There was no extra on board, so Powell and Atwood proved their mettle by using their toothbrush and toiletries holders—made of cut-glass and sterling silver—to transport water ounce by ounce from road-side ditches to the radiator. Perhaps certain car problems were unavoidable. After all, the trip put the Maxwell to the test for long days on difficult roads. Iowa’s weather posed particular challenges. 
There was “no gumbo too thick” for the Maxwell, said its manufacturers, but some potholed, muddy roads proved practically impassable for the tread-less tires. It was slow-moving and, in one case, no-moving: the women slept beside an overflowed creek until the water receded enough that they could ford it. They persevered through the region, taking 13 days to conquer 360 miles (and relying on horses for towing at times!). Because the automobile industry was still in its infancy, America’s roads were not yet designed for long-distance driving. For navigation, Ramsey relied on the Blue Book series of automotive guides, which gave directions using landmarks. But sometimes the route changed faster than the books. The women struggled to find a “yellow house and barn” at which they were supposed to turn left; a horse-loyal farmer had deliberately foiled drivers by repainting it green. Worse, there were no books for regions west of the Mississippi River. The Maxwell took worn routes, at crossroads following the telegraph poles “with the greatest number of wires,” according to Ramsey. On certain days, the Maxwell-Briscoe Company hired pilot cars, their drivers familiar with the area, to lead them. Even so, the party sometimes hit a dead end at a mine or sandpit and had to backtrack for miles. Beyond the physical triumph of survival, pride also came from the public’s enthusiastic support. Locals rode horses for miles and waited by roadsides for hours to catch a glimpse of the Ramsey team. Ramsey recalled a Western Union telegraph boy in Chicago who stared “dumbfounded” at the women. Though it was now typical to see women drive short distances, a cross-country trip by women had been tried only a handful of times and never accomplished. Only six years had passed since Dr. Horatio Nelson Jackson’s 1903 drive marked the first male cross-country success. When they entered California, Ramsey and her passengers marveled at the sugar pines and redwoods: “None of us had ever seen the like.” The same could be said for the media’s reaction upon their arrival. “PRETTY WOMEN MOTORISTS ARRIVE AFTER TRIP ACROSS THE CONTINENT,” the San Francisco Chronicle proclaimed. “The car for a lady to drive,” self-congratulated the Maxwell-Briscoe Company. It was August 7, 1909, and they had made it. In total, the trip had taken 59 days and covered 3,800 miles. After her brief bout with fame, Ramsey returned to New Jersey by train, where she resumed a relatively low-key profile raising two children. She continued her cross-country drives, losing count after her thirtieth. In 1960, the Automobile Manufacturers Association named her their “First Lady of Automotive Travel” for her trek across a “trackless land.” The next year Ramsey published Veil, Duster, and Tire Iron, a chronicle of the 1909 trip. She later drove five of the six passes of the Swiss Alps, giving up the last under doctor’s orders regarding her pacemaker. Ramsey died in 1983. The achievements of the Maxwell-Briscoe Company were shorter-lived; Chrysler absorbed the company in 1926. In 1999, when Alaska Airlines Magazine printed an article about the 90th anniversary of Ramsey’s trip, the story inspired car buff Richard Anderson and his daughter Emily. On June 9, 2009, Emily Anderson, a 37-year-old Seattle-based event manager and new mother, will commemorate the drive’s centennial by making her own cross-country trip in a 1909 Maxwell rebuilt by her father. Learning to drive the Maxwell has been challenging at times. 
Anderson often misses second gear and struggles with the clutch and brake, which use the same pedal, and she has been known to stall mid-intersection. But she calls her challenges “easy, when I consider what [Alice Ramsey] had to face.” There is one trial that, if accomplished, might impress even Ramsey: wearing period garb, Anderson and co-pilot Christie Catania will begin their trip by navigating through Manhattan on a weekday morning during rush hour! Richard Anderson has already had to explain himself and his seatbelt-free car to one concerned police officer during a practice drive. Whether the car will also face flak for its lack of blinkers (they will use hand signals to turn) or slow pace (the Maxwell still maxes out near 40 mph) remains to be seen. But if the precedent set by Ramsey holds, there will be no problem with the authorities: throughout her entire driving career, she received just one ticket. She had made an illegal U-turn—though not, of course, on her famed cross-country trip. In 1909, Ramsey forged only straight ahead.
ce69e87e65897c38b210e9f95f8b0716
https://www.smithsonianmag.com/history/amazing-if-true-story-submarine-mechanic-who-blew-himself-then-surfaced-secret-agent-queen-victoria-180951905/
The Amazing (If True) Story of the Submarine Mechanic Who Blew Himself Up Then Surfaced as a Secret Agent for Queen Victoria
The Amazing (If True) Story of the Submarine Mechanic Who Blew Himself Up Then Surfaced as a Secret Agent for Queen Victoria At 8:45 on the evening of February 17, 1864, Officer of the Deck John Crosby glanced over the side of the Federal sloop-of-war Housatonic and across the glassy waters of a calm Atlantic. His ship was blockading the rebel port of Charleston from an anchorage five miles off the coast, and there was always the risk of a surprise attack by some Confederate small craft. But what Crosby saw that night, by the dim light of a wintry moon, was so strange that he couldn’t be certain what it was. “Something on the water,” he recalled to a court of inquiry a week later, “which at first looked to me like a porpoise, coming to the surface to blow.” Crosby alerted the Housatonic’s quartermaster, but the object had already disappeared—and when he saw it again, a moment later, it was too close to the sloop for any hope of escape. As the Housatonic’s crew scrambled to their battle stations, there was a huge explosion on the starboard side. Their ship sank in minutes, taking five crewmen with her. It was not clear until some time later that the Housatonic had been the first victim of a new weapon of war. The ship—all 1,240 tons of her—had been sunk by the Confederate submarine H.L. Hunley: 40 feet of hammered iron, hand-cranked by a suicidally brave crew of eight men, and armed with a 90-pound gunpowder charge mounted on a spar that jutted, as things turned out, not nearly far enough from her knife-slim bow. The story of the Housatonic and the Hunley, and of the Hunley’s own sinking soon after her brief moment of glory, of her rediscovery in 1995 and her eventual salvage in 2000, has been told many times. We know a good deal now about Horace Hunley, the Louisiana planter who assembled the syndicate that paid for the submarine. We know about the design defects and the human errors that drowned two earlier Hunley crews, 13 men in all. We even know a little of James McClintock and Baxter Watson, the two mechanics who built the Hunley—not least that McClintock was the man who actually designed her, and so is probably the most important person in the story. What has not been known, at least until now, is exactly what became of James McClintock. The Hunley’s hundreds of historians sketch his story in a sentence or two. They take their information from McClintock’s grandson, Henry Loughmiller, who—writing to the researcher Eustace Williams—explained that his ancestor was “killed [in 1879] at the age of 50 in Boston Harbor when he was experimenting with his newly invented submarine mine.” It seems to be a fitting ending, but the Loughmiller account has been repeated endlessly for more than half a century without being checked. Yet fresh research suggests that each part of the story is dubious. Those who met James McClintock in 1879 thought him much closer to 60 years old than 50; the explosion that supposedly claimed his life took place outside Boston Harbor, and the evidence that it killed him is remarkably flimsy. Many people heard the explosion, but not a single person witnessed it. There was no body. There was no inquest. Not so much as a shred of mangled flesh was ever recovered from the water. And 16 months later, in November 1880, a man who said his name was James McClintock walked into the British consulate in Philadelphia to tell a most outlandish tale—and offer his services to Queen Victoria as a secret agent. 
James McClintock spent his boyhood navigating not eastern harbors, but the great rivers of the American interior. Census records confirm that the inventor was born in Ohio, and family tradition suggests that he grew up in Cincinnati and left home at an early age to join the crew of a Mississippi riverboat, acquiring sufficient skill to become “the youngest steamboat captain on the river” in the years before the Civil War. At some point, McClintock also began to show talent as an engineer and inventor. Caught in New Orleans by the war, he and Baxter Watson drew up plans for a new machine for making Minié balls, the rifled-musket bullets used by both sides throughout the conflict. According to the New Orleans Bee, the two men boasted that their invention would cost only $2,000 or $3,000 to make, and “with it two men can turn out a thousand balls per hour, or with steam power it makes eight or ten thousand per hour. This one machine, worked night and day, could turn out 1,200,000 balls every week, more than enough to supply the Confederate armies in the most desperate and extended war possible.” The Minié ball machine was never made, most likely because its usefulness had been thoroughly exaggerated. But it served as a calling card, and must have helped to persuade Horace Hunley to assemble a consortium that invested somewhere north of $30,000 in McClintock’s submarines. Reading between the lines of Civil War accounts, it seems likely that it was the desire to recover this investment, as much as patriotic fervor, that persuaded the boats’ owners to persevere in the face of repeated disaster: at least three sinkings, reported stiflings and near-stiflings, and even the death of Hunley himself, who, having fatally dived to the bottom during trials at Charleston in October 1863, was recovered with his crew when the submarine was salvaged three weeks later—“a spectacle,” one contemporary report related, “indescribably ghastly; the unfortunate men were contorted into all kinds of horrible attitudes, some clutching candles, evidently attempting to force open the manholes; others lying in the bottom, tightly grappled together, and the blackened faces of all presented the expression of their despair and agony.” Of all the men known to have boarded the Hunley, indeed, only about half a dozen escaped death in her belly. But McClintock himself survived the war, and when, in the fall of 1872, he traveled to Canada in an attempt to sell his submarine designs to the Royal Navy, the officers who interviewed him proclaimed themselves “strongly impressed with the intelligence of Mr. McClintock, and with his knowledge on all points, chemical and mechanical, connected with torpedoes and submarine vessels.” What led McClintock to Boston is only hazily known. By 1879 he was living in New Albany, on the Ohio River at the southern tip of Indiana, where his occupation was recorded as “salesman.” This suggests that his fortunes had reversed since 1872, when he had been the moderately prosperous owner-operator of a dredge boat on Mobile Bay. He was also married and the father of three daughters, and the evidence suggests that he had plenty of motivation to leverage his expertise in building secret weapons in the hope of snagging a fortune in the shady private armaments market. 
By 1877, certainly, McClintock had established contact with two other men who shared this ambition—George Holgate, a Philadelphian just setting out on what would become an infamous career as a free-lance bomb-maker, and a mysterious New Orleans river pilot by the name of J.C. Wingard, who had been with McClintock in Mobile during the war. Both of these men were extraordinary characters. Holgate, who seems to have been born in lowland Scotland, was the prolific inventor of an alarming collection of elaborate explosive devices that he hawked to all comers—Irish freedom fighters, Cuban patriots and Russian nihilists. “I no more ask a man,” he informed one newspaper reporter, “whether he proposes to blow up a Czar or set fire to a palace...than a gunsmith asks his customers whether they are about to commit a murder.” He claimed to be the former proprietor of a London paint shop that had been a front for a bomb-making business, though there is no trace of any such activities in a British press that became obsessed with bombers when the Irish Republican Brotherhood—a precursor to the IRA—began deploying them in London in 1867. By the early 1870s, Holgate was living in Oshkosh, Wisconsin, where he purchased a gun shop and touted a highly dubious invention that, he boasted, used injections of ozone to keep fruit, vegetables and even beef fresh for weeks. He was, the local Northwestern newspaper would recall, a “blatherskite” and “blowhard...one of those wild erratic individuals who now-a-days are gaining such cheap notoriety by cheap means.” But he was also—potentially, at least—a very dangerous man. The wares that he touted, Ann Larabee records, included a good deal more than conventional explosives: a cheap hand grenade, a bomb concealed in a satchel that had a fuse running through its keyhole, and a hat bomb made of dynamite pressed between two sheets of brass sewn into the crown with a fuse running around the rim. His “Little Exterminator” operated through a delicate watch mechanism that moved a tiny saw, releasing a chemical that smelled like cayenne pepper, killing anyone within one hundred feet. Wingard was even more remarkable. When the Civil War disrupted an early sideline as a prominent medium, he too turned to invention, re-emerging in New Orleans in 1876 as the proprietor of a death ray that he claimed was powerful enough to annihilate enemy ships across several miles of open water. Although a self-styled river “captain,” Wingard was almost entirely uneducated—“a plain, simple, straightforward man,” Emma Hardinge wrote in 1870. But he exhibited extraordinary talents as a medium. Amid the great spiritualism craze, which had burst on the United States late in the 1840s, Wingard became renowned as early as 1853 as a faith healer and for the “spirit drawings” he produced in darkened séance rooms “on paper which had previously been examined and found not to contain any marks.” His most remarkable performances, however, involved automatic writing: messages supposedly produced by spirits that had taken control of a medium’s body. According to Thomas Low Nichols, the revivalist preacher Jesse Babcock Ferguson swore that he had seen Wingard “write with both hands at the same time, holding a pen in each hand, sentences in different languages, of which he was entirely ignorant. He saw him, as did many other persons of undoubted credibility, write sentences in French, Latin, Greek, Hebrew, and Arabic.” The Civil War found Wingard in New Orleans. 
Just as the crisis had turned James McClintock’s interests toward bullets, it focused Wingard’s thoughts on an early sort of machine gun. This device was never built, but like the Minié ball machine it was extravagantly promoted. Wingard claimed that weapons made to his design would be capable of discharging 192 bullets a minute “at a range as great as any gun then in use.” Wingard’s interest in mechanical death-dealers persisted after the war, and early in 1876 he reappeared in New Orleans, calling himself “Professor” Wingard and claiming to have invented an astonishing new weapon capable of annihilating enemy warships at distances up to five miles. The manner in which this destruction was to be effected was left vague, though Wingard mentioned electricity—which, in the 1870s, was a new, powerful and poorly understood form of energy—and a separate Nameless Force, which in some mysterious way transmitted electrical power across water and focused it upon its target. This Nameless Force, he promised, would become “a factor controlling the destinies of a nation.” The tremendous public interest in Wingard’s invention survived two unsuccessful efforts to put the Nameless Force to work on Lake Pontchartrain. Chastened by his double failure, Wingard decided not to invite the New Orleans public to a third demonstration on June 1, 1876, but a “committee of gentlemen” was present when, at 2:35 p.m., the Professor—a small figure just visible across a mile or more of water—fired the weapon from a skiff. It was aimed at the Augusta, an old wooden schooner that had been anchored about two miles away, off a popular amusement park on the southern shore known as the Spanish Fort. This time, it seemed, the Nameless Force took effect, and the Augusta “suddenly blew up” about 90 seconds after Wingard’s invention was discharged. When the witnesses reached what remained of the vessel, they found her “shattered in small fragments,” and it seemed all the more impressive that Wingard “could not receive the congratulations of his friends” as he had somehow sustained severe burns to one hand in the course of the operation. From our perspective, though, the most important aspect of the demonstration was not Wingard’s brief lionization in New Orleans, but a deflating coda reported by the Galveston Daily News a few days later. According to that paper, “a delegation of newsboys, who happened to be in the vicinity, with a spirit of scientific research...visited the schooner despite repeated warnings to keep away, and reported that they found a large gas pipe filled with powder, and a wire leading towards [the skiff] that was anchored some distance away.” The entire demonstration, thus, had been a fraud; the only force involved, the News concluded, was a quantity of gunpowder concealed beneath the Augusta’s decks, and a long wire, “tightened by means of a windlass on the skiff,” that triggered the explosive. This discovery dented Wingard’s reputation, and he seems not to have been heard from again until he appeared in Boston late in 1879. What happened to McClintock, Holgate and Wingard in Massachusetts can be established from local newspaper reports. The men appeared in Boston in the first days of October and chartered first the steamboat Edith and then, on October 13, a sailboat, the Ianthe, with a rowboat as tender and a Nantucket man named Edward Swain as crew. On the afternoon of the 13th, Swain sailed the Ianthe to a spot off Point Shirley, to the east of Boston Harbor. 
It is at this point that accounts become confused, but the most considered and detailed of them state that Wingard had taken command of the Edith and was towing an old hulk that was to be used as a target. Holgate, who had been due to join Swain in the tender, complained of seasickness and retreated into the Ianthe’s deckhouse to lie down, so McClintock took his place, carrying with him a “torpedo”—a mine, in the language of the day—packed with 35 pounds of dynamite, which (the Boston Daily Advertiser reported) he had boasted was powerful enough to “blow up any fleet in the world.” He and Swain rowed off. Shortly thereafter, with the tender about a mile from the Ianthe and two miles from the Edith, there was an ear-shattering explosion. Wingard told the Advertiser that he had been “looking the other way” at the fatal moment but turned in time to see a column of spray and debris rising high into the air. Holgate, who said he had been lying in his bunk, likewise missed the explosion, but when the Ianthe and the Edith converged on the spot there was no trace of McClintock or Swain; all they could see floating on the surface was a mass of splinters. Neither Holgate nor Wingard seems to have been eager to make comments to the press, and both men quickly fled Boston—Holgate after securing McClintock’s possessions from his hotel room and without reporting the incident to the police. “He had a horror of recounting the event,” the Philadelphia Times explained after interviewing the old bomb-maker two decades later, “and so he said: ‘There can’t be an inquest unless there is a body to hold it upon, and there is not even a scrap left of my unfortunate companions.’” Indeed, the local authorities took remarkably little interest in what had happened. There seems to be no trace of any real investigation, nor even much curiosity as to why a trio of civilians was experimenting with unregulated explosives. Thus far, the accounts in contemporary newspapers contain nothing to contradict Henry Loughmiller’s belief that his grandfather died that day in Boston. But they offer odd pieces of testimony that do not mesh with the tales that Holgate and Wingard told. The Daily Globe, for instance, reported that Holgate’s involvement in the catastrophe had been greater than he was willing to admit; the “torpedo” was electric, the Globe explained, and the explosion had occurred when Holgate somehow set the charge off remotely. Strangest of all was a note in the same paper stating that a reliable witness—a hunter out shooting at Ocean Spray—had seen McClintock’s rowboat still afloat after the explosion, “so that the men, he thinks, could not have been blown to pieces.” Nothing came of any of this at the time. Holgate hastened to New York, and then home to Philadelphia, wiring McClintock’s family—so he said—to tell them of the awful accident. Wingard vanished. Boston’s harbor police dropped the halfhearted enquiries they had made, and nothing more was heard from any of the participants for more than a year. A good deal happened in the interim, however. Perhaps the most significant development took place in New York, where an ambitious splinter group from an Irish secret society known as the Clan na Gael began to plan a large-scale terrorist campaign on the British mainland. Led by Jeremiah O’Donovan Rossa, an Irish journalist who had been elected “Head Centre” of the Fenian movement in the United States, it began raising funds and looking for ways to manufacture bombs and smuggle them across the Atlantic. 
O’Donovan Rossa and his associates were nothing if not ambitious—they raised $43,000 (just over $1 million today) with the aim of spreading “terror, conflagration and irretrievable destruction” the length and breadth of England, and established a “Dynamite School” in Brooklyn to teach recruits how to make, conceal and use their bombs. But Rossa was also chronically indiscreet about their plans, and by the fall of 1880—a year after the explosion in Boston, but months before their terror campaign was due to begin—British diplomats in the United States were in a state of high alert, and desperately seeking information about how Rossa planned to spend his money. It was against this background that Robert Clipperton, the British consul at Philadelphia, received an unexpected visitor in October 1880. This man introduced himself as James McClintock, explained that he had a background in submarine and mine warfare—and revealed that he had been hired by Rossa’s Skirmishing Fund to build 15 examples of a new sort of torpedo so powerful that a single weapon filled with 35 pounds of explosives “could sink an ironclad if exploded under her bottom, and could be carried in a great-coat pocket.” This McClintock’s purpose in calling on Clipperton was to offer his services as a double agent. In exchange for payments of $200 ($4,650 today) each month, he was willing to betray his employers, slow down the work, hand over samples of the weapons and guarantee not to supply working models to Rossa’s terrorists. Clipperton was impressed by his visitor, and so were the consul’s masters at the British embassy at Washington. The British naval attaché, Captain William Arthur, arrived posthaste in Philadelphia, where on November 5 he met with McClintock and recommended his recruitment as a spy. The weapons, Arthur wrote, seemed viable, and the informant’s plans were workable—the doubt was his loyalty, not his truthfulness. As a result of this report, the man calling himself McClintock was paid $1,000, and Clipperton and his assistant, George Crump, continued to meet with him well into 1881. That March, the consul was handed samples of three different sorts of bomb—one disguised as a lump of coal and intended to be slipped into the bunkers of a transatlantic steamship, to explode with catastrophic consequences when it was shoveled into a furnace while the ship was out at sea. But who was the man whose appearance in Philadelphia caused Britain’s diplomats so much concern? Nothing in the official correspondence—lodged today in Britain’s National Archives—contains a physical description of the informant. But we can say he was just as traitorous as he appeared to be. By the time the official record petered out, in July 1881, he had extracted a four-figure sum from both Rossa’s Irish freedom fighters and Queen Victoria’s secret service fund. Moreover, he had betrayed both of his employers. Rossa never received his final consignment of torpedoes, and the samples that McClintock supplied to the British were fakes—“the contents of his cases are not dynamite,” a worried official reported from London when the test results came in, “but a powder made to resemble it of a very slightly explosive quality.” This James McClintock slipped away before either the British or the Fenians could lay hands on him. It seems that he was never heard from again. So who was the Philadelphia McClintock? There are certainly problems with the idea that he was the same man who was supposed to have died in Boston in 1879. 
That McClintock never returned to his family. He was listed as dead—killed in Boston—in the mortality schedule for 1880 that was compiled in his hometown in Indiana, and his grandson knew of nothing to suggest this was not true. And Holgate was vividly retelling the story of McClintock’s atomization as late as 1896. One possibility is that Clipperton’s informant was Holgate, posing as his old partner. A few details suggest that this might be the case. One is that “McClintock” chose to reappear in Philadelphia—which was, by 1880, Holgate’s home. The other is that the man who turned up at the British consulate explained that his device contained 35 pounds of explosives. Perhaps not coincidentally, that was precisely the size of the device that Holgate told the Boston press had blown up James McClintock. But would Holgate really have had much to gain by posing as his former partner? True, Holgate was no expert in underwater warfare, while McClintock was. But McClintock’s name would have carried no weight with any British diplomat in 1880. His role as the designer of the Hunley had never been disclosed. His visit to Canada had remained a state secret. And it would not be until well into the next century that his role in the destruction of the Housatonic would be celebrated. The only other plausible alternative is that the Philadelphia man was exactly who he claimed to be. Of course, for McClintock to have survived the explosion in Boston, he would have had to fake his death—and probably to become a murderer as well, for the unfortunate Edward Swain was never seen again. He would surely have needed a good reason to take these drastic steps, and it is possible to speculate that he had one—by the time he got to Boston, he seems to have been short of money, and a spectacular apparent death might have seemed a good way to escape his creditors, or perhaps an angry backer calling in a loan. In the final analysis, however, we cannot be certain that McClintock was desperate, and there are really only two ways to determine whether Clipperton’s informant was the man he said he was. One is to ask whether the events of 1879 make any sense viewed as a fraud. The other is to search the British archives for scraps of information that could have been provided only by the real McClintock. Certainly it strains credulity to suppose that McClintock rigged an explosion and then made a clean getaway without assistance from Wingard or Holgate. It would have been all but impossible for him to have escaped the scene without being noticed by one of them. And that the two men might have helped McClintock fake his death is not implausible; neither was a paragon of decency. But it is hard to imagine what their motive might have been, unless McClintock was their boss and paying them. Holgate’s accounts do suggest that his partner was the man in charge. But a clue buried in the Boston Daily Advertiser suggests that this was not the case. According to the Advertiser’s files, Wingard lodged at the United States Hotel, McClintock and Holgate at the Adams House. Since the United States was Boston’s second-best hotel, while the Adams House was a theater-district dive, the implication is that it was Wingard who had hired the other two. 
This certainly ties in with a note that appeared weeks later in the Chicago Daily Tribune, which reported that Wingard had traveled to Boston to stage another fraudulent trial of his Nameless Force for the benefit of fresh investors, and that he spent the first half of October assembling a joint stock company willing to plough $1,500 into his venture. The explosion put an end to that (the Tribune wrote), and a shaken Wingard confessed to his investors that the blast had taken place while two of his men were on their way to install hidden charges on the hulk selected for his demonstration. But if Wingard had no motive to assist McClintock, the same may not have been true of George Holgate. In this scenario, McClintock simply stayed on board the Ianthe with his partner and sent Swain off to die in the rowboat. The fact that the explosive charge was designed to be detonated remotely by wire, just as it had been in New Orleans, adds some weight to this theory, for if Swain rowed off trailing cable, as he must have done, the charge could have been detonated at any point—and, as the Boston Globe alleged, the explosion could have been triggered by Holgate. All McClintock needed to do at that point was to stay below while the Ianthe and the Edith converged upon the fatal spot. Wingard would have been none the wiser, McClintock would have escaped his creditors, and Holgate would have been owed favors by a man with extensive experience of explosives and underwater warfare. Bearing all this in mind, perhaps the salient point is this: the Philadelphia McClintock was able to convince the British naval attaché, Captain Arthur, that he knew all about mines and submarines. This would not have been an easy trick to pull, for Arthur was also an expert; his last posting before coming to America was as Captain of HMS Vernon, the Royal Navy’s chief research establishment for underwater warfare. So maybe, just maybe, the triple agent who tricked British officials and Irish terrorists in Philadelphia, and got away with $2,000 and his life, was precisely who he said he was: James R. McClintock, inventor of the H.L. Hunley, betrayer of countries, causes, friends and his own family, and the faker of his own strange death. Sources British National Archives: Admiralty papers. “Submarine warfare,” 1872, Adm 1/6236 part 2; “Fenian schemes to employ torpedoes against HM ships,” 1881, Adm 1/6551; digest for August 9, 1872 and October 19, 1872 at cut 59-8 of Adm 12/897; digest for February 8, 1873 at cut 59-8 of Adm 12/920. Foreign Office Papers. New Orleans consulate. Cridland dispatch no.2 commercial of April 5, 1872, enclosing statement by James McClintock, March 30, 1872, and Cridland to Foreign Office, July 17, 1872, both in FO5/1372; Fanshawe to Cridland, December 20, 1872, Cridland dispatch no.7 commercial of January 3, 1873, McClintock to Cridland, January 7, 1873, Cridland to Foreign Office, May 25, 1873, all in FO5/1441. Philadelphia consulate. Political correspondence for 1881 in FO5/1746 fols. 100-02, 146-7; FO5/1776, fols. 65-71, 80-5, 247, 249, 265, 291; FO5/1778 fols. 289, 403; United States censuses 1860 and 1870; Eustace Williams, “The Confederate submarine Hunley documents,” n.p., Van Nuys, California, 1958, typescript in the New York Public Library; Anon. “Some scientific hoaxes.” In Chambers’s Journal of Popular Literature, Science, and Art, June 12, 1880; Victor M. Bogle. 
“A view of New Albany society at mid-Nineteenth Century.” In Indiana Magazine of History 54 (1958); Boston Daily Advertiser, October 15, 16, and 20, 1879; Boston Evening Transcript, October 15, 1879; Boston Daily Globe, October 14, 15, 16 and 20, and November 17, 1879; Boston Weekly Globe, October 21, 1879; Carl Brasseaux & Keith P. Fontenot. Steamboats on Louisiana’s Bayous: A History and Directory. Baton Rouge: Louisiana State University Press, 2004; Chicago Daily Tribune, November 14, 1879; Mike Dash. British Submarine Policy 1853-1918. Unpublished PhD thesis, University of London, 1990; Esther Dole. Municipal Improvements in the United States, 1840-1850. Unpublished PhD thesis, University of Wisconsin, 1926; Ruth Duncan. The Captain and Submarine H.L. Hunley. Memphis: privately published, 1965; Charles Dufour. The Night the War Was Lost. Lincoln [NE]: Bison Books, 1964; Eaton Democrat [OH], June 20, 1876; Floyd County, Indiana, mortality schedule, 1880; Galveston Daily News, June 6, 1876; Emma Hardinge. Modern American Spiritualism: A Twenty Years’ Record. New York: The Author, 1870; Chester Hearn. Mobile Bay and the Mobile Campaign: The Last Great Battles of the Civil War. Jefferson [NC]: McFarland & Co., 1993; Ann Larabee. The Dynamite Fiend: The Chilling Tale of a Confederate Spy, Con Artist, and Mass Murderer. New York: Palgrave Macmillan, 2005; New Orleans Daily Democrat, March 22, 1877; New Orleans Times-Picayune, May 12, May 30 and June 4, 1876; New Orleans Daily Times, October 15, 1879; Thomas Low Nichols. Supramundane Facts in the Life of Rev. Jesse Babcock Ferguson. London: F. Pitman, 1865; Oshkosh Daily Northwestern, March 21, 1883; Ouachita Telegraph [LA], November 14, 1879; Philadelphia Times, February 26, 1896; Mark K. Ragan. Union and Confederate Submarine Warfare in the Civil War. Boston: Da Capo Press, 1999; Mark K. Ragan. The Hunley. Orangeburg [SC]: Sandlapper Publishing, 2006; K.R.M. Short. The Dynamite War: Irish-American Bombers in Victorian Britain. Atlantic Highlands [NJ]: Humanities Press, 1979; Niall Whelehan. The Dynamiters: Irish Nationalism and Political Violence in the Wider World, 1867-1900. Cambridge: Cambridge University Press, 2012. Mike Dash is a contributing writer in history for Smithsonian.com. Before Smithsonian.com, Dash authored the award-winning blog A Blast From the Past.
345c828ded075b026c2862c811ea5aa2
https://www.smithsonianmag.com/history/amazon-or-independence-hall-development-v-preservation-city-philadelphia-180967463/
Two Centuries Ago, Pennsylvania Almost Razed Independence Hall to Make Way for Private Development
Two Centuries Ago, Pennsylvania Almost Razed Independence Hall to Make Way for Private Development Goodbye Independence Hall, hello Amazon headquarters! That was the “news” recently spoofed by the popular parody website The Onion. The article lampooned Philadelphia’s eagerness to house Amazon’s second command center and included an image of the city leveled to make way for new business. “It was definitely bittersweet saying goodbye to the Liberty Bell,” says the satirical version of Mayor Jim Kenney, “but it’s important we encourage businesses to invest in the city.” The article’s humor arises, in part, from treating one of the nation’s most cherished historic monuments as prime real estate. Yet 200 years ago, Philadelphians faced this very situation when the commonwealth of Pennsylvania planned to subdivide the site for private development. The resulting campaign to preserve Independence Hall featured the same critiques of urban development, capitalist greed, and corrupt public interest that appeared in The Onion two centuries later. Since then, observers have viewed Independence Hall as a bellwether of the values guiding urban development. Their commentary reminds us that citizens long have shaped historic sites not simply to commemorate the past but also to define what should not be for sale during times of economic transition. Independence Hall’s preservation began in 1812 when Pennsylvania legislators planned to sell the building—then known as the old statehouse—and carve the surrounding green space into building lots. Colonial legislators had met in the building for four decades before American patriots made the spot notorious by signing the Declaration of Independence and debating the U.S. Constitution under its roof. After the state government removed its seat to Harrisburg in 1799, however, legislators saw the building and its surrounding land as potential revenue. Architectural salvage from the demolished building and multiple lots sold “to the highest and best bidders” would raise money to build a grand statehouse in the new capital. Philadelphia’s municipal leaders valued the site in a different way. The age of the building and the grounds surrounding it, they argued, did not make the site ripe for development. The civic value of the place outweighed any financial profit that development would bring. In other words, the permanence of this prime real estate would serve the public good. The city government offered to buy the site from the commonwealth for $70,000. State legislators refused, insisting that they would not settle for less than $150,000. Thus began a five-year campaign to preserve the old statehouse and its grounds as city property. City councillors first challenged the legality of development. Since 1735, colonial legislation had mandated that none of the open space around the statehouse “be converted into or made use of for erecting any sort of building thereupon, but that the said ground shall be inclosed, and remain a publick open green and walk for ever.” Editorials and city council reports also made arguments for the public good of open space and historic structures. They described the statehouse yard as a crucial source of air, light, and recreation for a growing urban populace. It also provided space for electioneering, they argued, which ensured the political health of the city and the nation. The historic features of the old statehouse contributed to civic health as well. 
They substantiated direct associations with the nation’s founding and formed an irreplaceable monument to a watershed moment in world history. Philadelphia’s leaders argued that when commonwealth officials demanded the maximum market price, they betrayed the public interest they claimed to represent. The land’s market value had increased substantially in the speculative real estate economy of the early nineteenth century, and a handful of political elites stood to profit at the expense of Philadelphia’s residents. Market growth, city officials argued, did not always engender urban improvement. Philadelphia’s City Council won out. In 1818, they took possession of the old statehouse and its yard. Their campaign for stewarding the site as permanent public space had helped to generate the political capital necessary to negotiate a sale on their terms. It also made Independence Hall a symbol for municipal leaders’ care for the city’s welfare. Subsequent generations revived the threat of Independence Hall’s demolition as a means to criticize municipal leadership. In 1845, George Lippard wrote a popular novel, The Quaker City, that depicted Philadelphia’s municipal leaders as wealthy men who exploited women, impoverished workers, and public trust for their own gain. In Lippard’s dystopian narrative, these men replaced Independence Hall with a gilt palace and surrounded it with new buildings. As Philadelphia leaders remade the city to stimulate industry and commerce, Lippard used the demolition of Independence Hall to question who benefitted from these changes. Municipal leaders, in turn, pointed to the preservation of Independence Hall as a sign of their public-mindedness. In the mid-20th century, urban planner Edmund Bacon introduced a plan to revitalize deindustrializing Philadelphia with attention to 18th-century architecture. He placed Independence Hall at the center of his plan to cultivate a tourist economy and leveled several blocks of 19th-century commercial buildings to open a dramatic vista of the building from Independence Mall. In this vision of preservation and redevelopment, Philadelphia could profit as a steward of national heritage. City leaders made the same argument when they advocated for Unesco World Heritage designation for the site in 1979 and the city in 2015. When Onion writers depicted the mayor’s destruction of Independence Hall in 2017, they continued this conversation for a new generation facing economic and urban change. In recent weeks, Philadelphia’s municipal leaders have pulled out all the stops to pitch their city as the perfect locale for Amazon’s second headquarters. The “Philadelphia Delivers” campaign has spread glowing images of the city across a slick website and promotional video. It even bought advertising in the Seattle transit system. In this campaign, Philadelphia boosters highlight the city’s open space as a key feature of its appeal. Sites opened by the decline of industry — railyards by the Schuylkill River and South Philadelphia’s Navy Yard — await more productive uses in the new economy, they say. Like the first writers who argued for the preservation of Independence Hall two hundred years ago, the Onion writers pushed back against the notion that old buildings and open space mark sites ripe for new development. The history of Independence Hall’s preservation shows how old this argument really is. 
As Philadelphia faces a preservation crisis that could be intensified by the arrival of Amazon, Independence Hall recalls the city’s early commitment to the public protection of open space, historic buildings, and the sense of place defined by current city residents. Members of Mayor Kenney’s Historic Preservation Task Force, as well as all Philadelphians, would be wise to consider this legacy as they articulate a plan for managing the relationship between private development and civic health. Just as activists have used Independence Hall as a symbol for the expansion of civil rights, today’s city leaders might harness the ideals of public space embodied by this site to increase the number of sites that get preserved and the types of citizens involved in selecting them. In this way, Independence Hall can serve as an icon not only of the ideals of human equality but also of the city’s mandate to steward historic resources for all residents. Whitney Martinko’s column for the Lepage Center this fall focuses on how early histories of historic preservation in the U.S. can inform debates over historic monuments today. She is an assistant professor of History at Villanova University.
a07b3bac0de3ca0886ba4da449de20ce
https://www.smithsonianmag.com/history/americans-illegally-immigrated-mexico-180973306/
When Mexico’s Immigration Troubles Came From Americans Crossing the Border
When Mexico’s Immigration Troubles Came From Americans Crossing the Border During the two centuries in which Mexico and the United States have shared a border, allegations of out-of-control immigration, with Mexican immigrants posing a threat to American security, have often been a staple of American politics and a source of friction and concern. But the worry has worked both ways. In the immediate wake of Mexico’s successful war for independence from Spain, Mexican officials grew alarmed about illegal immigration from the United States. By the late 1820s, the situation on the border, located on the Sabine River that today separates Texas from Louisiana, appeared to be reaching the crisis stage. Not all the immigration from the north was illegal. Starting in 1821, Stephen F. Austin, the man later referred to as “The Father of Texas,” brought hundreds of American immigrants to Mexican Texas, with the explicit approval and support of the Mexican government. The government, in order to populate the thinly settled province and build a bulwark against Indian attacks, promised the immigrants land—far more land than most of them would ever have acquired in the United States. At a time when a farm in Tennessee might encompass a quarter-section, or 160 acres, the standard grant in Austin’s Texas colony was a league, or more than 4,000 acres. A common concern of farmers in the United States was that their children, upon reaching adulthood, would have to move away to find farms of their own. Lack of land was one of the principal forces driving America’s westward movement. In Texas, an American colonist could receive land enough for all his children and their children. Mexican law made princes of many who might have been paupers in the United States. But many of the Americans who came to Texas did so beyond the auspices of Mexican law. By handfuls at first, then by scores, then by hundreds and thousands, Americans poured into Texas illegally. They seized whatever land parcels weren’t occupied and made them their homes. Mexican officials were few in Texas, and they were distracted by the turbulence that roiled Mexican politics in the aftermath of the war against Spain. The squatters could be in place for months or years before the government took notice. By then, the squatters thought of the land as their own, and they didn’t hesitate to defend it with deadly force. Within several years of Austin’s arrival in Texas, the situation in Texas was spinning out of Mexico’s control. The government appointed a commission to examine the Texas question; at its head was Manuel de Mier y Terán, a general in the Mexican army, a former government minister and member of the Mexican congress, an engineer and a scientist. Terán reached Texas in the spring of 1828 and spent the next several months traveling about the settled regions. He visited San Antonio, which remained thoroughly Mexican. But farther east the American influence took hold. His party crossed the Guadalupe River. 
“On the eastern bank of this river there are six wooden cabins, whose construction shows that those who live in them are not Mexicans,” Terán wrote. “Though the house is a single piece, it has two rooms, a high one and a low one. In the latter is found the storeroom and kitchen, whose chimney sticks up on the outside, and in the higher part are the bedroom and living room.” The Americans at first seemed standoffish. “I approached a cabin in hopes that its owner might offer me shelter, but it was in vain,” Terán wrote. “I learned later that the North Americans are not used to making such invitations. One arrives quite naturally, sure of being well received. But if one stops at the door, no one encourages him to come inside.” Certain other Americans were as pleasant as could be. Terán’s party crossed the Colorado River on a ferry owned by an American named Beeson. “He is quite urbane, his family very honorable,” Terán noted. “Their services were very helpful to us.” Beeson’s wife had learned enough Spanish to explain how well Texas suited them. They had built a cabin and expanded their herd of cattle. “Madame says they have 1,200 pesos in savings. They have been on this land for five years, and they speak with great satisfaction of its fertility and good climate. In a word, they seem happy.” Terán got to Austin’s colony, on the Brazos River, in late April. He was highly impressed with the energy and productivity of the immigrants, reckoning the colony’s annual corn crop at 64,000 bushels and the cotton crop at 240,000 pounds. Most of the former and essentially all of the latter were exported, as were mules that the Americans raised for sale in the West Indies. Terán had expected to see self-sufficient farms; what he found instead was a hive of commercial enterprise. Terán asked the Americans what had brought them to Texas. Many mentioned the Texas climate. “To the north the freezing temperatures and snows create obstacles to their work for several months and force them to labor harder. In Texas they work year-round and therefore in greater moderation. In winter they clear and prepare the land that they will plant in the spring.” Others referred to the Mexican markets for the crops they raised. “In the north”—of Mexico—“agricultural production outstrips demand, and the prices are exceedingly low. The colonists hope for greater appreciation in the ports and on the coast of Mexico.” The Americans had big goals. “They hope to take over the supply of flour, grains, and meats in the ports.” The residents of Austin’s colony had the makings of solid Mexican citizens, Terán allowed, even if they clung to their American ways. He couldn’t say the same about the Americans he encountered farther east. Nacogdoches, 50 miles from the border, marked the beginning of a kind of no-man’s-land that stretched to the Sabine River and American-owned territory. The inhabitants put even Terán’s party of soldiers on guard. “A great number of the foreigners who have entered the frontier are vicious and wild men with evil ways,” Terán wrote. “Some of them are fugitive criminals from the neighboring republic; within our borders they create disturbances and even criminal acts.” The United States and Mexico had not worked out border enforcement and extradition rules. “The inhabitants take advantage of their friends and companions to attack and to defend themselves and cross from one side to the other in order to escape punishment.” At Nacogdoches, Terán reflected on what he had seen. 
“As one travels from Béxar”—San Antonio—“to this town, Mexican influence diminishes, so much so that it becomes clear that in this town that influence is almost nonexistent,” he wrote. “But where could such influence come from? Not from the population, because the ratio of the Mexican population to the foreign is one to ten; nor from its quality, because the population is precisely the contrary: the Mexicans of this town consist of what people everywhere call the abject class, the poorest and most ignorant.” The Americans in Nacogdoches operated an English-language school for their children. “The poor Mexicans neither have the resources to create schools, nor is there anyone to think about improving their institutions and their abject condition.” As a result, English had become the language of the region, and American influence appeared to be its future. So what was to be done with the American immigrants? How to stem the invasion? Terán saw no easy answers. “Nature tells them that the land is theirs,” Terán wrote, “because, in effect, everyone can appropriate what does not belong to anyone or what is not claimed by anyone. When the occasion arises, they will claim the irrefutable rights of first possession.” Terán acknowledged that the legal immigrants of Austin’s colony were a different sort than the illegals of the border region. But he wasn’t sure that this made the future of Mexican Texas any more secure. “I must say in all frankness that everyone I have talked to here who is aware of the state of the country and devoted to its preservation is convinced, and has convinced me, that these colonies, whose industriousness and economy receive such praise, will be the cause for the Mexican federation to lose Texas unless measures are taken soon.” What kind of measures? First, a stronger military presence. “On the frontier there are intrigues,” Terán wrote. And lest the intrigues become rebellions, Mexico needed more soldiers in Texas. Second, immigration should be suspended until it could be controlled. The border must be policed and illegal immigrants deported. Third, and most important, Mexico needed to make Texas truly Mexican, before the Americans made it irretrievably American. “The land of Texas, or at least its eastern part where its principal rivers begin to be navigable, should be reserved for Mexican settlers,” Terán declared. He didn’t advocate removing legal immigrants like those in Austin’s colony, but any new settlers must come from Mexico, not from the United States. Terán proposed that the government provide incentives to attract five thousand Mexicans to the Trinity River, to act as a bulwark against the Americans. His plan would be expensive, Terán conceded. But he saw no other choice. If current trends persisted, Texas would be lost. Adapted from Dreams of El Dorado: A History of the American West by H.W. Brands. Copyright © 2019 by H.W. Brands. Available from Basic Books on October 22. H. W. Brands holds the Jack S. Blanton Sr. chair in history at the University of Texas at Austin. A New York Times-bestselling author, he was a finalist for the Pulitzer Prize in biography for Traitor to His Class. His new book is Dreams of El Dorado: A History of the American West. He lives in Austin, Texas.
17eb0fedb07a382c50a0e971b37064e2
https://www.smithsonianmag.com/history/americans-who-saw-lady-liberty-false-idol-broken-promises-180972285/
The Americans Who Saw Lady Liberty as a False Idol of Broken Promises
The Americans Who Saw Lady Liberty as a False Idol of Broken Promises It was a crisp, clear fall day in New York City, and like many others, Lillie Devereux Blake was eager to see the great French statue, donated by that country’s government to the United States as a token of friendship and a monument to liberty, finally unveiled. President Grover Cleveland was on Bedloe’s Island (since renamed Liberty Island), standing at the base of the statue, ready to give a speech. Designed in France, the statue had been shipped to New York in the spring of 1885, and now, in October 1886, it was finally assembled atop its pedestal. “Presently the veil was withdrawn from her beautiful calm face,” wrote Blake of the day’s events, “and the air was rent with salvos of artillery fired to hail the new goddess; the earth and the sea trembled with the mighty concussions, and steam-whistles mingled their shrill shrieks with the shouts of the multitude—all this done by men in honor of a woman.” Blake wasn’t watching from the island itself, though—in fact, only two women had been invited to the statue that day. Blake and other members of the New York State Women’s Suffrage Association, at that point New York’s leading women’s suffrage organization, had chartered their own boat in protest of the exclusion of women not just from the statue’s unveiling, but from the idea of liberty itself. Blake’s protest is one of several highlighted at the new Statue of Liberty Museum, which opened earlier this month on Liberty Island. While the statue’s pedestal did at one point hold a small museum, the new space’s increased square footage allowed historians and exhibit designers to expand the story of Lady Liberty, her champions and her dissenters. “In certain people’s retelling of the statue and certain ways it gets told, it often seems like there’s a singular notion, whether it’s the statue as a symbol of America or the statue as the New York icon or the statue as the beacon of immigration,” says Nick Hubbard, an exhibition designer with ESI Designs, the firm responsible for the staging of the new museum. But as the newspaper clippings, broadsheets, and images in the space themselves explain, the statue—and what it symbolized—wasn’t universally beloved, and to many, it was less a beacon of hope than an outright slap in the face. * * * The French bequeathed the statue itself as a gift, but it was up to the people of America to supply it with a pedestal. After both the state of New York and the federal government declined to fund the project, New York World publisher Joseph Pulitzer announced he would use his paper to raise $100,000 (more than $2 million in today’s currency) for the pedestal. The proposition was straightforward: Mail in a donation, get your name printed in the paper. Stories abounded of small children and elderly women sending in their allowances and their spare change, and the heartwarming tales of common folk supporting the grand project captured the front pages of Pulitzer’s paper and the imagination of the country, largely cementing the idea that the Statue of Liberty was, from the beginning, universally beloved by Americans. Immediately, though, cracks emerged in this façade. Blake and the nearly 200 other women who sailed to Bedloe’s Island issued a proclamation: “In erecting a Statue of Liberty embodied as a woman in a land where no woman has political liberty, men have shown a delightful inconsistency which excites the wonder and admiration of the opposite sex,” they pointed out. 
President Cleveland, during his speech, took no notice of the women floating directly below him, Blake brandishing a placard bearing the statement “American women have no liberty.” Suffragists around the country, however, noticed, and the statue for them became both a symbol of all they didn’t yet have and a rallying point for demanding it. In later decades, Susan B. Anthony and Elizabeth Cady Stanton visited the statue, and after a 1915 measure to give women the right to vote in New York failed at the ballot box, a group of suffragists used a 1916 visit by Woodrow Wilson to drop thousands of ‘Votes For Women!’ leaflets at the statue via biplane. The statue’s unveiling dominated headlines for weeks before and after the official date, and the Cleveland Gazette, an African-American-run newspaper with a circulation of 5,000, was no exception. On November 27, 1886, a month after the statue opened to the public, their front page ran an editorial titled “Postponing Bartholdi’s statue until there is liberty for colored as well.” “Shove the Bartholdi statue, torch and all, into the ocean,” the Gazette argued, “until the ‘liberty’ of this country is such as to make it possible for an inoffensive and industrious colored man in the South to earn a respectable living for himself and family, without being ku-kluxed, perhaps murdered, his daughter and wife outraged, and his property destroyed. The idea of the ‘liberty’ of this country ‘enlightening the world,’ or even Patagonia, is ridiculous in the extreme.” Hubbard says including a section of the Gazette editorial in the exhibit was crucial to communicating that the Statue of Liberty posed—and still poses—an ongoing series of questions about American values. “We really had to set up the idea that the statue is sort of a promise, it represents and is a symbol of basic American and foundational American ideas,” he says. “It sets up that promise but then even from the beginning there are people who say, ‘But wait, that promise is not necessarily fulfilled.’” While the Statue of Liberty has, for most of its time in New York’s harbor, been framed as a symbol of immigration in America, at the time of its assembly, the country was just beginning to formally limit the number of people who could immigrate each year. In 1882, the federal government passed the Chinese Exclusion Act, the first large-scale immigration law and one that explicitly made the case for prioritizing—and restricting—immigrants based on race. Chinese-American writer Saum Song Bo responded to the Pulitzer solicitations of funds for the statue’s pedestal by sending a letter to the New York Sun: “I consider it as an insult to us Chinese to call on us to contribute toward building in this land a pedestal for a statue of Liberty,” Bo wrote. “That statue represents Liberty holding a torch which lights the passage of those of all nations who come into this country. But are the Chinese allowed to come? As for the Chinese who are here, are they allowed to enjoy liberty as men of all other nationalities enjoy it? Are they allowed to go about everywhere free from the insults, abuse, assaults, wrongs and injuries from which men of other nationalities are free?” It’s this idea that “liberty” is far from a fixed word with a fixed meaning that lies at the heart of the Statue of Liberty Museum’s experience. “When the designers were thinking of the statue, of course how people interpreted liberty and what it meant was already very complicated and contested,” says Hubbard. 
Incorporating those perspectives in the exhibit allows the space to make the point that now, more than 100 years after the Statue of Liberty’s torch was first lit, Lady Liberty still stands over New York harbor as a symbol of how far the nation has come and how far it still has to go. Angela Serratore is a writer and a contributing editor at Smithsonian.com.
41d3cc55a419bf8da3042e2b76ff9138
https://www.smithsonianmag.com/history/americas-big-circus-spectacular-has-long-and-cherished-history-180962621/
America’s Big Circus Spectacular Has a Long and Cherished History
America’s Big Circus Spectacular Has a Long and Cherished History When Barnum and Bailey’s “Greatest Show on Earth” rolled into American towns in the 1880s, daily life abruptly stopped. Months before the show arrived, an advance team saturated the surrounding region with brilliantly colored lithographs of the extraordinary: elephants, bearded ladies, clowns, tigers, acrobats and trick riders. On “Circus Day,” huge crowds gathered to observe the predawn arrival of “herds and droves” of camels, zebras, and other exotic animals—the spoils of European colonialism. Families witnessed the raising of a tented city across nine acres, and a morning parade that made its way down Main Street, advertising the circus as a wondrous array of captivating performers and beasts from around the world. For isolated American audiences, the sprawling circus collapsed the entire globe into a pungent, thrilling, educational sensorium of sound, smell and color, right outside their doorsteps. What townspeople couldn't have recognized, however, was that their beloved Big Top was also fast becoming a projection of American culture and power. The American three-ring circus came of age at precisely the same historical moment as the U.S. itself. Three-ring circuses like Barnum and Bailey's were a product of the same Gilded Age historical forces that transformed a fledgling new republic into a modern industrial society and rising world power. The extraordinary success of the giant three-ring circus gave rise to other forms of exportable American giantism, such as amusement parks, department stores, and shopping malls. The first circuses in America were European—and small. Although circus arts are ancient and transnational in origin, the modern circus was born in England during the 1770s when Philip Astley, a cavalryman and veteran of the Seven Years War (1756-1763), brought circus elements—acrobatics, riding, and clowning—together in a ring at his riding school near Westminster Bridge in London. One of Astley’s students trained a young Scotsman named John Bill Ricketts, who brought the circus to America. In April of 1793, some 800 spectators crowded inside a walled, open-air, wooden ring in Philadelphia to watch the nation’s first circus performance. Ricketts, a trick rider, and his multicultural troupe of a clown, an acrobat, a rope-walker, and a boy equestrian, dazzled President George Washington and other audience members with athletic feats and verbal jousting. Individual performers had toured North America for decades, but this event marked the first coordinated performance in a ring encircled by an audience. Circuses in Europe appeared in established urban theater buildings, but Ricketts had been forced to build his own wooden arenas because American cities along the Eastern Seaboard had no entertainment infrastructure. Roads were so rough that Ricketts' troupe often traveled by boat. They performed for weeks at a single city to recoup the costs of construction. Fire was a constant threat due to careless smokers and wooden foot stoves. Soon facing fierce competition from other European circuses hoping to supplant his success in America, Ricketts sailed for the Caribbean in 1800. While returning to England at the end of the season, he was lost at sea. After the War of 1812, American-born impresarios began to dominate the business. In 1825, Joshua Purdy Brown, a showman born in Somers, New York, put a distinctly American stamp on the circus. 
In the midst of the evangelical Second Great Awakening (1790-1840), an era of religious revivalism and social reform, city leaders in Wilmington, Delaware banned public amusements from the city. Brown stumbled upon the prohibition during his tour and had to think fast to outwit local authorities, so he erected a canvas “pavilion circus” just outside the city limits. Brown’s adoption of the canvas tent revolutionized the American circus, cementing its identity as an itinerant form of entertainment. Capital expenses for tenting equipment and labor forced constant movement, which gave rise to the uniquely American one-day stand. On the frontier edges of society, entertainment-starved residents flocked to the tented circus, which plodded by horse, wagon, and boat, pushing westward and southward as the nation’s borders expanded. The railroad was the single most important catalyst for making the circus truly American. Just weeks after the completion of the Transcontinental Railroad in May 1869, Wisconsin showman Dan Castello took his circus—including two elephants and two camels—from Omaha to California on the new railroad. Traveling seamlessly on newly standardized track and gauge, his season was immensely profitable. P.T. Barnum, already a veteran amusement proprietor, recognized opportunity when he saw it. He had set a bar for giantism when he entered the circus business in 1871, staging a 100-wagon “Grand Traveling Museum, Menagerie, Caravan, and Circus.” The very next year, Barnum’s sprawling circus took to the rails. His partner William Cameron Coup designed a new flatcar and wagon system which allowed laborers to roll fully loaded wagons on and off the train. Barnum and Coup were outrageously successful, and their innovations pushed the American circus firmly into the combative scrum of Gilded Age capitalism. Before long, size and novelty determined a show’s salability. Rival showmen quickly copied Barnum’s methods. Competition was fierce. Advance teams posting lithographs for competing shows occasionally erupted in brawls when their paths crossed. In 1879, James A. Bailey, whose circus was fresh off a two-year tour of Australia, New Zealand, and South America, scooped Barnum when one of his elephants became the first to give birth in captivity at his show’s winter quarters in Philadelphia. Barnum was begrudgingly impressed—and the rivals merged their operations at the end of 1880. Like other big businesses during the Gilded Age, the largest railroad shows were always prowling to purchase other circuses. Railroad showmen embraced popular Horatio Alger “rags-to-riches” mythologies of American upward mobility. They used their own spectacular ascent to advertise the moral character of their shows. Bailey had been orphaned at eight, and had run away with a circus in 1860 at the age of 13 to escape his abusive older sister. The five Ringling brothers, whose circus skyrocketed from a puny winter concert hall show in the early 1880s to the world’s largest railroad circus in 1907, were born poor to an itinerant harness maker and spent their childhood eking out a living throughout the Upper Midwest. These self-made American impresarios built an American cultural institution that became the nation’s most popular family amusement. Barnum and Bailey's big top grew to accommodate three rings, two stages, an outer hippodrome track for chariot races, and an audience of 10,000. 
Afternoon and evening performances showcased new technologies such as electricity, safety bicycles, automobiles, and film; they included reenactments of current events, such as the building of the Panama Canal. By the end of the century, circuses had entertained and educated millions of consumers about the wider world, and employed over a thousand people. Their moment had come. In late 1897, Bailey took his giant Americanized circus to Europe for a five-year tour, just as the U.S. was coming into its own as a mature industrial powerhouse and mass cultural exporter. Bailey transported the entire three-ring behemoth to England by ship. The parade alone dazzled European audiences so thoroughly that many went home afterwards mistakenly thinking they had seen the entire show. In Germany, the Kaiser’s army followed the circus to learn its efficient methods for moving thousands of people, animals, and supplies. Bailey included patriotic spectacles reenacting key battle scenes from the Spanish-American War in a jingoistic advertisement of America’s rising global status. Bailey’s European tour was a spectacular success, but his personal triumph was fleeting. He returned to the United States in 1902 only to discover that the upstart Ringling Brothers now controlled the American circus market. When Bailey died unexpectedly in 1906, and the Panic of 1907 sent financial markets crashing shortly thereafter, the Ringlings were able to buy his entire circus for less than $500,000. They ran the two circuses separately until federal restrictions during World War I limited the number of railroad engines they could use. Thinking the war would continue for many years, the Ringlings decided to consolidate the circuses temporarily for the 1919 season to meet federal wartime regulations. The combined show made so much money that the Ringling Bros. and Barnum & Bailey’s Circus became permanent—known as "The Greatest Show on Earth"—until earlier this year, when, after 146 years, it announced it would close. The Smithsonian Folklife Festival celebrates its 50th anniversary this year with an exploration of the life and work of circus people today. "Circus Arts" performances, food and workshops take place on the National Mall in Washington, D.C., June 29 to July 4 and July 6 to July 9. Janet M. Davis teaches American Studies and History at the University of Texas at Austin. She is the author of The Gospel of Kindness: Animal Welfare and the Making of Modern America (2016); The Circus Age: American Culture and Society Under the Big Top (2002); and editor of Circus Queen and Tinker Bell: The Life of Tiny Kline (2008).
0799ed8416e00ae4297c01f4db3158f8
https://www.smithsonianmag.com/history/andrew-jackson-populist-even-deathbed-180962124/
Andrew Jackson Was a Populist Even on His Deathbed
Andrew Jackson Was a Populist Even on His Deathbed Andrew Jackson lay gasping in his bed at home in Tennessee, the lead slugs in his body at long last having their intended effect. It was the spring of 1845 and “Old Hickory”—hero of the War of 1812 and the nation’s seventh president, born 250 years ago, on March 15, 1767—was finally dying after so many things and people had failed to kill him. The 78-year-old was wracked by malarial coughs from his field campaigns against the British, Creeks and Seminoles, and plagued by wounds from two duels, which had left bullets lodged in his lungs and arm. It was so apparent he would soon be buried that a friend offered him a coffin. This was no ordinary box, however. It was a massive and ornate marble sarcophagus. Jackson’s old compatriot Commodore Jesse D. Elliott had purchased it in Beirut while serving as commander of the U.S. naval fleet in the Mediterranean, and brought it back in his flagship the USS Constitution, along with a mummy and a dozen Roman columns. The 7½- by 3-foot sarcophagus, embellished with carved rosettes and cherubs, was thought to have once held the remains of the third-century Roman ruler Alexander Severus. Elliott believed it would be an illustrious vessel for the corpse of the former president. “Containing all that is mortal of the patriot & hero, Andrew Jackson, it will, for a long succession of years, be visited as a hallowed relic,” he predicted. Elliott’s proposal said much about the powerful personality cult surrounding the president and the fanatical worshipfulness of his admirers. It also said something about the size of Jackson’s ego and taste for tribute that Elliott believed he would accept it. This article is a selection from the March issue of Smithsonian magazine Jackson’s reputation as a populist was disputed by his contemporaries. To his admirers he was a supremely gifted leader; to his critics, a self-interested tyrant and power-mad chieftain, whose farewell address was “happily the last humbug which the mischievous popularity of this illiterate, violent, vain and iron-willed soldier can impose upon a confiding and credulous people,” wrote one Whig newspaper. Was Jackson truly, as he called himself, “the immediate representative of the American people”? Or was it “effrontery,” as his alienated vice president, John C. Calhoun, put it, to call himself a champion of the common man? “He certainly believed that he came from the people and exercised power on behalf of the people,” says historian H. W. Brands, author of Andrew Jackson: His Life and Times. “But he wasn’t like most people who voted for him.” He was the sworn enemy of elitism, who bore scars from a sword wound on his head for refusing to polish a British officer’s boots after being captured as a 14-year-old soldier in South Carolina during the American Revolution. Yet he was a remorseless slaveholder who chased gentleman-planter status. He was a merciless remover of Indians yet a tender collector of orphans, who took in a Creek boy, Lyncoya, found next to the child’s dead mother on a battlefield, as well as several nephews. He was a savage swearer of oaths, “a barbarian who could not write a sentence of grammar and hardly could spell his own name,” according to his rival John Quincy Adams. Yet a surprised hostess once found Jackson to be a courtly “prince” in a parlor. 
He had the humblest beginnings of any president up to that point and despised inherited wealth, yet he was a dandy preoccupied with the cut of his coat and the quality of the racehorses at his plantation, the Hermitage. “Infatuated man!” Calhoun railed against him. “Blinded by ambition—intoxicated by flattery and vanity!” Yet for all that he loved adulation, Jackson declined the sarcophagus. “I cannot consent that my mortal body shall be laid in a repository prepared for an Emperor or a King—my republican feelings and principles forbid it—the simplicity of our system of government forbids it,” he wrote to Elliott. Jackson died a few weeks later, on June 8, 1845. “I wish to be buried in a plain, unostentatious manner,” he instructed his family. He was placed alongside his wife at the Hermitage, without much in the way of ceremony, but with a huge outpouring from the thousands who attended, including his pet parrot, Pol, who had to be removed for squawking her master’s favorite oaths. As for Elliott, he gave the empty sarcophagus to the fledgling Smithsonian. “We cannot but honor the sentiments which have ruled his judgement in this case,” Elliott observed of the president, “for they are such as much add to the lustre of his character.” Editor’s Note, March 22, 2017: This article has been updated to reflect Commodore Jesse D. Elliott’s report that he purchased the sarcophagus in Beirut. Sally Jenkins is a sports columnist and feature writer for the Washington Post. Her most recent book is Sum It Up, a biography of NCAA coach Pat Summitt.
b6361e3c8d63d03ce846c91bad435c30
https://www.smithsonianmag.com/history/archaeologists-identify-famed-fort-where-indigenous-tlingits-fought-russian-forces-180976818/
Archaeologists Identify Famed Fort Where Indigenous Tlingits Fought Russian Forces
Archaeologists Identify Famed Fort Where Indigenous Tlingits Fought Russian Forces For thousands of years, the Tlingit people made their home in the islands of Southeast Alaska among other indigenous peoples, including the Haida, but at the turn of the 19th century, they came into contact with a group that would threaten their relationship with the land: Russian traders seeking to establish a footprint on the North American continent. The colonists had been expanding into Alaska for decades, first exploiting Aleut peoples as they chased access to sea otters and fur seals that would turn profits in the lucrative fur trade. The Russian American Company, a trading monopoly granted a charter by Russian tsar Paul I just as British monarchs had done on the continent’s east coast in the 17th century, arrived in Tlingit territory around Sitka in 1799. On the eastern edge of the Gulf of Alaska, the settlement was at an ideal location for the company to advance its interests into the continent. Stopping them, however, was resistance from a Tlingit community uninterested in becoming colonial subjects. In an attempt to oust the colonizers, the Kiks.ádi clan launched an attack on a Russian outpost near Sitka called Redoubt Saint Michael in 1802, killing nearly all of the Russians and Aleuts there. The Kiks.ádi clan members were prepared for retaliation after a tribal shaman predicted the Russians, led by Alexander Baranov, would return. The Tlingits built a wooden fort to stave off the foretold attack, which came in the fall of 1804, when Baranov returned with his forces to demand that they surrender their land. The Kiks.ádi instead prepared for battle. They repelled the initial assault by the Russians and Aleuts, but after six days, with supplies dwindling, the clan’s elders decided to withdraw and embark on a survival march north. The Russians quickly established a fortified presence in Sitka, and with that new foothold, they would claim the entirety of Alaska as a colony, which they would later sell to the United States in 1867 for $7.2 million. Today, Sitka National Historical Park commemorates the site of a battle that changed the course of Alaska’s history, but the precise location of the Tlingit fort has remained unknown until now. More than two centuries later, archaeologists using ground-penetrating radar and electromagnetic instruments have finally pinpointed the stronghold where Native Alaskans resisted colonization. The peninsula where the fort was located on what’s now called Baranof Island has long been recognized as a site of historical importance. It was given federal protection by the U.S. government as a monument beginning in 1910. Now a popular tourist attraction—it’s a common destination for the region’s cruise industry—the park has walking paths lined with Tlingit and Haida totem poles. Much of the seaside park is forested, but the U.S. National Park Service had designated a clearing as the probable location of the fort, which was documented and then razed by the Russians. However, there has not been broad agreement on where exactly the fort was, says Cornell research scientist Thomas Urban, lead author of the new study published in Antiquity. “A number of investigations over the years produced some clues but no definitive answer,” Urban says. 
“Outside of the clearing itself, geophysical surveying is very tedious because most of the peninsula is densely forested.” Urban says he was in Sitka to assist in locating burials at a historic cemetery when park officials first inquired about resuming the search for the fort site. In the 1950s, archaeologists had dug a few trenches and discovered what they thought were rotting pieces of the fort's palisade wall. An NPS team revisited the site from 2005 to 2008 and found cannonballs and other artifacts associated with the battle inside the clearing commemorating the fort. But those researchers could not confirm that this was indeed the correct location of the fort. In the summer of 2019, Urban and study co-author Brinnen Carter, of the National Park Service, scanned large swathes of the park, including areas with thick vegetation, using new technologies. Geophysical tools allow archaeologists to see buried structures without excavating because different features and materials—for example, bricks, postholes, cannonballs, and loose soil—often have different signatures. The footprint of the underground structures Urban and Carter found matches the drawings Russians made of the trapezoidal fort. At around 300 feet long and 165 feet wide, the fort's perimeter encloses the modern clearing. Such forts were not part of traditional Tlingit architecture—most of the other fort sites take advantage of natural rock formations—but the building seems to have been an adaptation to deal with conflict with colonizers, says Thomas Thornton, a dean at the University of Alaska Southeast and a researcher affiliated with the University of Oxford. The name for the fort in Sitka, Shís'gi Noow, means “sapling fort” in English, which hints at an important innovation: the Tlingits learned that more flexible new-growth timbers would better absorb the shock of Russian cannonballs. The fort “represents a watershed event in Alaska's history,” says Thornton, who wasn’t involved in the new research but has studied Tlingit history and collaborated on research with the Sitka Tribe. “If we better understand it through archaeology, through oral history, the better off we will be in terms of being informed about this history, which is still quite present in the architecture of Sitka and the relationships that you find in Sitka.” Oral histories collected by Tlingit members of the Sitka Tribe explore unsettled aspects of the conflict. Through arduous trial and error, Kiks.ádi elder Herb Hope spent years in the late 1980s and 1990s trying to retrace the steps of his ancestors’ survival march to determine their route. (He came to the conclusion that they likely took a coastal path.) Hope once said that he was inspired to undertake the project after he saw Kiks.ádi clan members apologizing for their part in the 1804 war. He wanted to dismantle the notion that the Kiks.ádi retreated. "It was a survival march through our own backyard to a planned location," he told a Tlingit conference in 1993. "However the Russians may have viewed the battle at that time and however history may view that battle today, at that time the Battle of Sitka of 1804 clearly showed the rest of the world that the Russian forces in Alaska were too weak to conquer the Tlingit people." Oral histories suggest that up to 900 people took part in that march. 
The Kiks.ádi moved from campsite to campsite north along the craggy landscape of Baranof Island to reach Point Craven on the neighboring Chichagof Island, where they took over an abandoned fort called Chaatlk'aanoow. From that post they were able to hurt Russian trade by establishing a blockade of Sitka Sound. According to Hope's account: "The blockade became even more effective once the Yankee traders learned of the blockade and sought to exploit it. They set up a trading station across from Chootlk'aanoow on Catherine Island to the south. Even to this day it bears the name 'Traders Bay.' Trader canoes from all over the northern end of Southeast Alaska came to trade with the Yankee traders at Traders Bay." The Tlingit people returned to Sitka in 1821, but would never again have sovereign control of the island. The NPS and Urban currently have no further plans to investigate the fort site, but its identification offers a clearer picture of a short-lived but hugely significant building. And for Urban, the identification of the fort also shows the potential for geophysical investigations in Alaska, which he says have recently been used to find burial sites, the ruins of houses, mammoth bones, and even 10,000-year-old campfires. Megan Gannon is a science journalist who often writes about archaeology and space. She was previously a news editor at Live Science and Space.com.
8ca4b57feb31225a0b28eef2c6d36950
https://www.smithsonianmag.com/history/archaeologoists-only-just-beginning-reveal-secrets-hidden-ancient-manuscripts-180967455/
Archaeologists Are Only Just Beginning to Reveal the Secrets Hidden in These Ancient Manuscripts
Archaeologists Are Only Just Beginning to Reveal the Secrets Hidden in These Ancient Manuscripts Last summer, Giulia Rossetto, a specialist in ancient texts at the University of Vienna, was on a train home to Pordenone, in northern Italy, when she switched on her laptop and opened a series of photographs of a manuscript known as “Arabic New Finds 66.” It is no ordinary manuscript. In antiquity, it was common practice when parchment supplies were limited to scrape the ink from old manuscripts, with chemicals or pumice stones, and reuse them. The resulting double-text is called a palimpsest, and the manuscript Rossetto was studying contained several pages whose Christian text, a collection of saints’ lives written in tenth-century Arabic, hid a much older text beneath, in faintest Greek. Nothing was known about what this “undertext” contained. Rossetto, a PhD student, was given the images as an afterthought, when an older scholar complained that reading them was beyond his failing eyesight. But these were no ordinary photographs, either. They were taken using a state-of-the-art technique known as multispectral imaging, or MSI, in which each page of a text is photographed many times while illuminated by different colors and wavelengths of light, and then analyzed using computer algorithms to find a combination that most clearly distinguishes the two layers of text. As Rossetto’s train sped through the Austrian Alps, she flipped between the images, adjusting the contrast, brightness and hue to minimize the appearance of the Arabic overtext while picking out tiny Greek letters, each around three millimeters tall. The style of the script suggested that it was probably written in Egypt in the fifth or sixth century, and Rossetto expected another Christian text. Instead, she began to see names from mythology: Persephone, Zeus, Dionysus. The lost writing was classical Greek. There was no internet connection on the train. But as soon as she got home, Rossetto rushed to her computer to check her transcription against known classical texts. “I tried different combinations, and there was nothing,” she recalls. “I thought, ‘Wow, this is something new.’” In his poem “Endymion,” based on a Greek myth about a shepherd beloved by the moon goddess Selene, John Keats paid tribute to the enduring power of superior works of art. “A thing of beauty is a joy for ever,” he wrote. “Its loveliness increases; it will never / Pass into nothingness.” Surely to uncover lost poetry from an ancient civilization from which we draw so many of our literary traditions is as exciting as unearthing any material treasure. And this promise reaches beyond aesthetics. When classical Greek literature was rediscovered during the European Renaissance, it remade Western civilization, and planted seeds that still shape our lives today: Thomas Jefferson’s ideas about the pursuit of happiness were sparked by the Greek philosophers; suffragists were inspired by Euripides’ heroine Medea. Like finding an old photograph of a long-dead relative, discovering a lost piece of text can help us glimpse ourselves in the people who came before us. Rossetto’s text is just one of hundreds whose recovery was recently announced by researchers participating in a project to decipher the secrets of a unique treasury. In the Sinai Desert, in Egypt, a monastery called St. Catherine’s hosts the world’s oldest continually operating library, used by monks since the fourth century. 
In addition to printed books, the library contains more than 3,000 manuscripts, accumulated over the centuries and remarkably well preserved by the dry and stable climate. The monks at St. Catherine’s were particularly fond of reusing older parchment for their religious texts. Today the library holds at least 160 palimpsests—likely the largest collection in the world. But the ancient scribes did their job frustratingly well. In most cases, the texts underneath were hidden and, until now, thought lost. ********** St. Catherine’s, a community of 25 or so Greek Orthodox monks at the foot of Mount Sinai, transcends history, in that ancient traditions live on into the present day. The first mention of its written collection comes from an account by a fourth-century pilgrim named Egeria, who described how the monks read biblical passages to her when she visited a chapel built to commemorate Moses’ burning bush. In the sixth century, the Byzantine emperor Justinian protected that chapel with hefty granite walls. Fifteen hundred years later, they stand intact. As you approach it, the sand-colored monastery, nestled low on the mountain, looks humble and timeless, like something made of the desert. Inside is a warren of stone steps, arches and alleyways; a square bell tower draws the eye upward toward the jagged mountain peaks above. Despite the rise and fall of surrounding civilizations, life here has changed remarkably little. The monks’ first daily worship still begins at 4 a.m. Central to St. Catherine’s, now as in Egeria’s time, is the library, and the person in charge of it is the Rev. Justin Sinaites, who wears a long, gray beard and the black robes traditional to his faith. Born in Texas and brought up Protestant, Father Justin, as he prefers to be known, discovered Greek Orthodoxy while studying Byzantine history at the University of Texas at Austin. After converting to the faith, he spent more than 20 years living at a monastery in Massachusetts, where, as head of the monastery’s publications, he became adept at using computer and desktop publishing technology. In 1996, Father Justin moved to St. Catherine’s, and when the monastery’s abbot decided to digitize the library’s manuscript collection to make it available to scholars around the world, Father Justin was asked to lead the effort. When I reached Father Justin in Egypt by telephone this fall, he was thoughtful and articulate, and gave the impression, like the monastery itself, of existing on a plane outside of worldly limitations. Asked to describe the physical size of the library, he at first seemed baffled. “I don’t think in those terms,” he said. During our conversation, he routinely answered my questions with stories rooted hundreds of years in the past. Because the librarian alone was allowed to access the library vaults, the manuscripts were always brought to him one by one, their darkened edges and drops of candle wax testament to centuries of wear and use. “I was so eager to go in and see everything else, and I couldn’t,” he says. Then, about ten years ago, “they made me the librarian.” Finally he could explore the full collection, including the palimpsests. The problem was that there didn’t seem much hope of reading them. But in 2008, researchers in the United States announced the completion of a ten-year project to use multispectral imaging to read lost works by the Greek mathematician Archimedes hidden beneath the liturgy of a 13th-century Byzantine prayer book. 
Father Justin, who already knew members of the group, asked if they would come to St. Catherine’s. The resulting collaboration, known as the Sinai Palimpsests Project, is directed by Michael Phelps of the California-based Early Manuscripts Electronic Library, a nonprofit research group that works with universities such as UCLA and other institutions to digitize historical source materials and make them accessible for study. Beginning in 2011, Phelps and other members of the project made 15 visits to the monastery over five years, each time driving for hours through the Sinai Desert, the site of ongoing conflict between Egyptian security forces and Islamic militants. Many of the palimpsests come from a cache of about 1,100 manuscripts found in a tower of the monastery’s north wall in 1975, and consist of damaged leaves left behind when the library was moved in the 18th century, then hidden for protection after an earthquake. They are tinder dry, falling to pieces and often nibbled by rats. Father Justin brought each palimpsest out in turn to be photographed by the project’s chief camera operator, Damianos Kasotakis, who used a 50-megapixel camera custom-built in California. Photographing each page took about seven minutes, the shutter clicking repeatedly while the page was illuminated by infrared, visible and ultraviolet lights that ran across the color spectrum. The researchers toyed with different filters, lighting from strange angles, anything they could think of that might help pick out details from a page’s surface. Then a group of imaging specialists based in the United States “stacked” the images for each page to create a “digital cube,” and designed algorithms, some based on satellite imaging technology, that would most clearly recognize and enhance the letters beneath the overtext. “You just throw everything you can think of at it,” Kasotakis says, “and pray for the best.” ********** Perhaps someone was listening. Late last month, the monastery and the Early Manuscripts Electronic Library announced at a conference in Athens that over the five-year period they had imaged 6,800 pages from 74 palimpsests, which will be made accessible online by UCLA sometime in early 2018. So far, their work has revealed more than 284 erased texts in ten languages, including classical, Christian and Jewish texts dating from the fifth century until the 12th century. The collection is being compared to the greatest manuscript discoveries of the 20th century, including the Nag Hammadi codices of Egypt and the Dead Sea Scrolls. Already, as a part of the Sinai Palimpsests Project, some two dozen scholars from across Europe, the United States and the Middle East are poring over these texts. One of the most exciting finds is a palimpsest made up of scraps from at least ten older books. The manuscript is a significant text in its own right: the earliest known version of the Christian Gospels in Arabic, dating from the eighth or ninth century. But what’s underneath, Phelps predicts, will make it a “celebrity manuscript”—several previously unknown medical texts, dating to the fifth or sixth century, including drug recipes, instructions for surgical procedures (including how to remove a tumor), and references to other tracts that may provide clues about the foundations of ancient medicine. 
Another fragment of this palimpsest contains a beautiful two-page illustration of a flowering plant—from an “herbal,” or guide to medicinal plants, which Nigel Wilson, a classicist at Oxford who is studying the text, believes may be a work by Crateuas, physician to the poison-obsessed Anatolian king Mithradates in the first century B.C. Copies of his drawings made as late as 600 years after his death survive, but until now we knew his writings only through quotations by the first-century physician Dioscorides. “This is the first scrap we’ve got of an actual manuscript of his work,” says Wilson. From the same palimpsest Agamemnon Tselikas, director of the Center for History and Palaeography in Athens, recovered the earliest known versions of classic texts by Hippocrates, the father of Western medicine, which are four centuries older than any copies previously known. Other fragments include bits as unexpected as a version of an ancient Greek adventure story called Apollonius of Tyre, which is now the oldest known Latin translation and, by 500 years, the earliest illustrated version. Giulia Rossetto, who discovered her own celebrity manuscript on a train ride home to Italy, is still piecing together the implications of her find. So far she has deciphered 89 lines of text (many of them incomplete) and learned that they belong to a previously unknown poem written in Greek hexameter—the same scheme used for Homer’s epics. They tell of a myth in which Dionysus, the young son of Zeus and Persephone, sits on a throne as a group of murderous Titans tries to win his confidence. Rossetto also found the number 23 in the text, which she believes denotes a book number, hinting, she says, that the lines might come from the Rhapsodies, attributed by the ancients to the mythical poet Orpheus and collected in 24 books, like Homer’s poems. The Rhapsodies were widely studied until at least the sixth century, but are today known only through fragmentary quotations by later philosophers. Now Rossetto has found what may be lines from the Rhapsodies themselves. The discovery, says Claudia Rapp, a professor of Byzantine studies at the University of Vienna and Rossetto’s supervisor, is the kind of thing that appears perhaps once in a generation. “The dream of everybody working with palimpsest material is to find previously unknown bits of classical texts from pagan antiquity.” ********** The secrets of each individual manuscript will keep scholars busy for years to come. Yet there’s an even bigger discovery emerging from this project, beyond the many textual revelations: the surprising history of St. Catherine’s itself. Rapp, who also serves as the Sinai project’s scholarly director, has been especially intrigued to learn what the palimpsests reveal about the process by which parchments were reused. In none of them is there an apparent relationship between the overtext and undertext, she says. Indeed, scattered pages from multiple older manuscripts, in different languages, were often brought together to make a new book. Rather than individual scribes selecting manuscripts to scrape clean for personal use, this suggests an organized production, perhaps even commercial circulation, of recycled parchment sheets. And the sheer variety of languages uncovered was entirely unexpected. 
Some of the texts are even helping to reconstruct lost languages, including Caucasian Albanian, spoken in an ancient kingdom in present-day Azerbaijan, and Christian Palestinian Aramaic, used by Christians in Palestine until the 13th century. Researchers also discovered several Greek texts translated into Syriac, which was first spoken by Syrian Christians before becoming a major literary language throughout the Middle East. We already know that in the eighth and ninth centuries, the Islamic caliphate, then based in Baghdad, sponsored a huge program to translate Greek classical knowledge through Syriac into Arabic (a project that helped save much of classical Western knowledge during the Dark Ages). These Syriac undertexts show that Christian scholars at St. Catherine’s were a part of this effort. “We can see this great translation movement in process,” Phelps says. Each surprise adds a piece to the puzzle. The discovery of two unknown Christian texts in the ancient language of Ge’ez suggests that Ethiopian monks, who were not thought to have had much contact with Sinai in antiquity, may once have practiced at the monastery. And one palimpsest, which Michelle Brown, a former curator at the British Library in London, describes as a “Sinai sandwich,” is remarkable for the relationship it suggests between four different layers of text. Its oldest layer was written in Greek, at St. Catherine’s. Next is an undertext in a Latin script used in Italy at the turn of the seventh century, then an eighth-century Latin insular script, a style of writing pioneered by monks in Ireland that flourished in the British Isles. The top layer is an Arabic script written at St. Catherine’s around the turn of the tenth century. This is a real breakthrough—a “smoking gun,” Brown says. Scholars have assumed that there was little contact between the Middle East and the West in the Middle Ages, before the Crusades, but Brown suspected from what she could already make out of the palimpsest and other fragments at St. Catherine’s that this view was wrong. The layering of these scripts revealed by the new imaging supports her hunch. It’s exceedingly unlikely that the pages were carried from Sinai to Rome, to Britain, and then back again. Instead, she says, monks from these distinct Western communities must have been working at St. Catherine’s over the centuries. Put all of that together, and our view of this humble outpost is transformed. We might think of the Sinai Desert merely as a remote wilderness where the Jews wandered for decades after their escape from Egyptian slavery. But the diverse findings of the palimpsests project offer stunning testimony to St. Catherine’s role as a vibrant cosmopolitan center and a key player in the cultural history of East and West, where people of different languages and communities met and exchanged practices and intellectual traditions. “It is a place where people made the effort to travel to,” says Rapp. “And they came from all over the world.” ********** For Father Justin, the project represents a remarkable opportunity to extend what he calls a “living tradition” at St. Catherine’s, in which each manuscript is not only a holy object but a tangible witness to visitors from the remote past. For centuries, the monastery’s walls protected these manuscripts, but the political situation outside remains turbulent; last spring, militants allied with ISIS killed a policeman a few hundred yards from its gates. 
Although Father Justin insists this danger isn’t representative, he hopes the imaging project will help to protect the manuscripts’ treasures for centuries to come: “That is our obligation and our challenge today.” This article is a selection from the January/February issue of Smithsonian magazine Jo Marchant is an award-winning science journalist and former editor at New Scientist and Nature. She is the author of The Human Cosmos: Civilization and the Stars.
dda616534182c0a8ae5c544a655a6a21
https://www.smithsonianmag.com/history/assignment-afghanistan-85698509/
Assignment Afghanistan
Assignment Afghanistan As my eyes adjusted to the dark and gloomy schoolroom, I could see the men more clearly, their woolen shawls drawn up against their tough and leathery faces. They were farmers and herders who lived hard lives on meager land, survivors of foreign occupation and civil war, products of a traditional society governed by unwritten rules of religion and culture and tribe where Western concepts like freedom and happiness were seldom invoked. But there was something I had not seen before in the faces of these turbaned villagers: an almost childish excitement, a look both nervous and dignified—a feeling of hope. It was October 9, 2004, and they were among 10.5 million voters who had registered to vote in the first direct presidential election in their country’s history. No one shoved or jostled as the line inched toward a pair of scarred school benches, where two elderly officials were checking off ledgers, marking thumbs with purple ink, murmuring instructions: “There are 18 candidates for president, here are their names and pictures, mark the one you want, but only one.” Then they handed each man a folded paper and motioned him politely toward a flimsy metal stand curtained with a red gingham cloth. I positioned myself behind one of the benches. I wanted to remember this day, this hushed and universal ritual of a fledgling democracy that once had seemed impossible to imagine. In another week, I would be leaving the country after nearly three years that had been among the most exhilarating, as well as the most grueling, of my career as a foreign correspondent. During that time I had covered the assassinations of two cabinet ministers, tiptoed through the human wreckage of car bombings, chronicled the rapid spread of opium poppy cultivation, witnessed the release of haggard war prisoners and the disarming of ragged militiamen. But I had also traveled with eager refugees returning home from years in exile, visited tent schools in remote villages and computer classes in makeshift storefronts, helped vaccinate flocks of sheep and goats, watched parched and abandoned fields come alive again, and reveled in the glorious cacophony of a capital city plugging into the modern world after a quarter-century of isolation and conflict. Even on days when I awoke feeling as if there were little hope for the country and less I could do to help, invariably something occurred that restored my faith. Someone made a kind gesture that dissipated the poison around me, told me a tale of past suffering that put the day’s petty grievances in new perspective, or expressed such simple longing for a decent, peaceful life that it renewed my determination to make such voices heard above the sniping and scheming of the post-Taliban era. On this particular day, it was the look on a young farmer’s face as he waited to vote in a chilly village schoolroom. He was a sunburned man of perhaps 25. (Once I would have said 40, but I had learned long ago that wind and sand and hardship made most Afghans look far more wizened than their years.) He was not old enough to remember a time when his country was at peace, not worldly enough to know what an election was, not literate enough to read the names on the ballot. But like everyone else in the room, he knew this was an important moment for his country and that he, a man without education or power or wealth, had the right to participate in it. The farmer took the ballot gingerly in his hands, gazing down on the document as if it were a precious flower, or perhaps a mysterious amulet. 
I raised my camera and clicked a picture I knew I would cherish for years to come. The young man glanced up at me, smiling shyly, and stepped behind the gingham curtain to cast the first vote of his life. I first visited Afghanistan in 1998, a dark and frightened time in a country that was exhausted by war, ruled by religious zealots and shut off from the world. Kabul was empty and silent, except for the squeak of carts and bicycles. Entire districts lay in ruins. Music and television had been banned, and there were no women on the streets except beggars hidden beneath patched veils. For a Western journalist, the conditions were hostile and forbidding. I was not allowed to enter private homes, speak to women, travel without a government guide or sleep anywhere except the official hotel—a threadbare castle where hot water was delivered to my room in buckets and an armed guard dozed all night outside my door. Even carefully swathed in baggy shirts and scarves, I drew disapproving stares from turbaned gunmen. Interviews with Taliban officials were awkward ordeals; most recoiled from shaking my hand and answered questions with lectures on Western moral decadence. I had few chances to meet ordinary Afghans, though I made the most of brief comments or gestures from those I encountered: the taxi driver showing me his illegal cassettes of Indian pop tunes; the clinic patient pointing angrily at her stifling burqa as she swept it off her sweat-soaked hair. I visited Afghanistan that first time for three weeks and then nine more times during Taliban rule. Each time the populace seemed more desperate and the regime more entrenched. On my last trip, in the spring of 2001, I reported on the destruction of two world-renowned Buddha statues carved in the cliffs of Bamiyan, and I watched in horror while police beat back mobs of women and children in chaotic bread lines. Exhausted from the stress, I was relieved when my visa expired and headed straight for the Pakistan border. When I reached my hotel in Islamabad, I stripped off my dusty garments, stood in a steaming shower, gulped down a bottle of wine and fell soundly asleep. The first sprigs of green were poking up from the parched winter fields of the Shomali Plain stretching north from Kabul. Here and there, men were digging at dried grapevine stumps or pulling up buckets of mud from long-clogged irrigation canals. Bright blue tents peeked out from behind ruined mud walls. New white marking stones had been neatly placed on long-abandoned graves. Along the highway heading south to Kabul, masked workers knelt on the ground and inched forward with trowels and metal detectors, clearing fields and vineyards of land mines. It had been a year since my last visit. From the terrible ashes of the World Trade Center had risen Afghanistan’s deliverance. The Taliban had been forced into flight by American bombers and Afghan opposition troops, and the country had been reinvented as an international experiment in postwar modernization. Within a month of the Taliban’s defeat, Afghanistan had acquired a dapper interim leader named Hamid Karzai, a tenuous coalition government, pledges of $450 million from foreign donors, a force of international peacekeepers in Kabul, and a blueprint for gradual democratic rule that was to be guided and financed by the United Nations and the Western powers. For 35 months—from November 2001 to October 2004—I would now have the extraordinary privilege of witnessing Afghanistan’s rebirth. 
This was a journalist’s dream: to record a period of liberation and upheaval in an exotic corner of the world, but without having to be afraid anymore. As on my trips during the Taliban era, I still wore modest garments (usually a long-sleeved tunic over baggy trousers) in deference to Afghan culture, but I was free to stroll along the street without worrying I would be arrested if my head scarf slipped, and I could photograph markets and mosques without hastily hiding my camera under my jacket. Best of all, I could chat with women I encountered and accept invitations to tea in families’ homes, where people poured out astonishing tales of hardship and flight, abuse and destruction—none of which they had ever shared with a stranger, let alone imagined seeing in print. Just as dramatic were the stories of returning refugees, who poured back into the country from Pakistan and Iran. Day after day, dozens of cargo trucks rumbled into the capital with extended families perched atop loads of mattresses, kettles, carpets and birdcages. Many people had neither jobs nor homes awaiting them after years abroad, but they were full of energy and hope. By late 2003, the United Nations High Commissioner for Refugees had registered more than three million returning Afghans at its highway welcome centers. I followed one family back to their village in the Shomali Plain, passing rusted carcasses of Soviet tanks, charred fields torched by Taliban troops, and clusters of collapsed mud walls with a new plastic window here or a string of laundry there. At the end of a sandy lane, we stopped in front of one lifeless ruin. “Here we are!” the father exclaimed excitedly. As the family started unloading their belongings, the long-absent farmer inspected his ruined vineyards—then graciously invited me back to taste his grapes after the next harvest. Another wintry day I drove up into the Hindu Kush mountains, where the main highway tunnel to the north had been bombed shut years before and then lost beneath a mountain of ice. I will never forget the scene that met my eyes through the swirling snow: a long line of families, carrying children and suitcases and bundles toward the tunnel, edging down narrow steps and vanishing inside the pitch-black passageway cut through the ice. I tried to follow, but my hands and my camera froze instantly. An arctic wind howled through the darkness. As I emerged from the tunnel, I brushed past a man with a little girl on his back, her naked feet purple with cold. “We have to get home,” he muttered. Ahead of them was a two-hour trek through hell. The rapidly filling capital also sprang back to life, acquiring new vices and hazards in the process. Bombed buildings sprouted new doors and windows, carpenters hammered and sawed in sidewalk workshops, the air was filled with a clamor of construction and honking horns and radios screeching Hindi film tunes. Traffic clogged the streets, and policemen with whistles and wooden “stop” paddles flailed uselessly at the tide of rusty taxis, overcrowded buses and powerful, dark-windowed Land Cruisers—the status symbol of the moment—that hurtled along narrow lanes as children and dogs fled from their path. Every time I sat fuming in traffic jams, I tried to remind myself that this busy anarchy was the price of progress and far preferable to the ghostly silence of Taliban rule. As commerce and construction boomed, Kabul became a city of scams. Unscrupulous Afghans set up “nonprofit” agencies as a way to siphon aid money and circumvent building fees. 
Bazaars sold U.N. emergency blankets and plastic-pouched U.S. Army rations. Landlords evicted their Afghan tenants, slapped on some paint and re-rented their houses to foreign agencies at ten times the previous rent. But hard-working survivors also thrived in the competitive new era. During the Taliban years, I used to buy my basic supplies (scratchy Chinese toilet paper, laundry detergent from Pakistan) from a glum man named Asad Chelsi who ran a tiny, dusty grocery store. By the time I left, he had built a gleaming supermarket, filled with foreign aid workers and affluent Afghan customers. The shelves displayed French cheese, German cutlery and American pet food. A born entrepreneur, Asad now greeted everyone like an old friend and repeated his cheerful mantra: “If I don’t have what you want now, I can get it for you tomorrow.” The sound of the bomb was a soft, distant thud, but I knew it was a powerful one and steeled myself for the scene I knew I would find. It was midafternoon on a Thursday, the busiest shopping time of the week, and the sidewalk bazaars were crowded. The terrorists had been clever: first a small package on a bicycle exploded, drawing a curious crowd. Several moments later, a far larger bomb detonated in a parked taxi, shattering shop windows, engulfing cars in flames and hurling bodies in the air. Firemen were hosing blood and bits of glass off the street and sirens wailed. Fruits and cigarettes lay crushed; a boy who sold them on the sidewalk had been taken away, dead. As my colleagues and I rushed back to our offices to write our reports, news of a second attack reached us: a gunman had approached President Karzai’s car in the southern city of Kandahar and fired through the window, narrowly missing him before being shot dead by American bodyguards. Karzai appeared on TV several hours later, wearing a confident grin and dismissing the attack as an occupational hazard, but he must have been at least as shaken as the rest of us. The list of those with motive and means to subvert the emerging order was long, but like the taxi bomb that killed 30 people on that September day in 2002, most terrorist crimes were never solved. In many parts of the country, militia commanders commonly known as warlords maintained a tight grip on power, running rackets and imposing their political will with impunity. People feared and loathed the warlords, pleading with the government and its foreign allies to disarm them. But the gunmen, with little respect for central authority and many skeletons left over from the rapacious civil-war era of the early 1990s, openly defied the disarmament program that was a key element of the U.N.-backed plan for transition to civilian rule. Karzai’s own tenuous coalition government in Kabul was rent by constant disputes among rival factions. The most powerful were a group of former commanders from the northern Panjshir Valley, ethnic Tajiks who controlled thousands of armed men and weapons and who viewed themselves as the true liberators of Afghanistan from Soviet occupation and Taliban dictatorship. Although formally part of the government, they distrusted Karzai and used their official fiefdoms in the state security and defense apparatus to wield enormous power over ordinary citizens. Karzai was an ethnic Pashtun from the south who controlled no army and exercised little real power. 
His detractors derided him as the “mayor of Kabul” and an American puppet, and after the assassination attempt he became a virtual prisoner in his palace, protected by a squad of American paramilitary commandos sent by the Bush administration. I observed Karzai closely for three years, and I never saw him crack. In public, he was charming and cheerful under impossible circumstances, striding into press conferences with a casual, self-confident air and making solemn vows for reforms he knew he could not possibly deliver. In interviews, he was effortlessly cordial and relentlessly upbeat, though I always sensed the barely concealed frustration of a leader in a straitjacket. Everyone, perhaps no one more than the president, knew that without American B-52 bombers leaving streaks across the sky at crucial moments, the Afghan democratic experiment could collapse. Instead the country lurched, more or less according to plan, from one flawed but symbolic political milestone to the next. First came the emergency Loya Jerga of June 2002, an assembly of leaders from across the country that rubber-stamped Karzai as president but also opened the doors to serious political debate. Then came the constitutional assembly of December 2003, which almost collapsed over such volatile issues as whether the national anthem should be sung in Pashto or Dari—but which ultimately produced a charter that embraced both modern international norms and conservative Afghan tradition. The challenge that occupied the full first half of 2004 was how to register some ten million eligible voters in a country with poor roads, few phones, low literacy rates and strong rural taboos against allowing women to participate in public life. After a quarter-century of strife and oppression, Afghans were eager to vote for their leaders, but many feared retaliation from militia commanders and opposed any political procedure that would bring their wives and sisters into contact with strange men. There was also the problem of the Taliban. By 2003, the fundamentalist Islamic militia had quietly regrouped and rearmed along the Pakistan border. They began sending out messages, warning all foreign infidels to leave. Operating in small, fast motorbike squads, they kidnapped Turkish and Indian workers on the new Kabul-to-Kandahar highway, ambushed and shot a team of Afghan well-diggers, and then executed Bettina Goislard, a young French woman who worked for the U.N. refugee agency. Once voter registration began, the Taliban shifted targets, attacking and killing half a dozen Afghan registration workers. But the extremists miscalculated badly. Afghans were determined to vote, and even in the conservative Pashtun belt of the southeast, tribal elders cooperated with U.N. teams to find culturally acceptable ways for women to cast their ballots. One June day, driving through the hills of Khost Province in search of registration stories, I came upon a highway gas station with a line of men outside, waiting to have their voter ID photos taken. When I asked politely about the arrangements for women, I was led to a farmhouse filled with giggling women. None could read or write, but a high-school girl filled out each voting card, guessing at their ages, and an elderly man carried them to the gas station. “We want our women to vote, so we have made this special arrangement,” a village leader explained to me proudly. 
“If they cross the road and some strange driver sees them, people would talk.” Ballrooms twinkled with fairy lights, amplified music pulsed and pounded, young women in slinky sequined dresses twirled across the floor. Kabul was in a post-Taliban wedding frenzy; a society re-knitting itself and reestablishing its rituals after years of repression and flight. Ornate salons were booked around the clock, and beauty parlors were crammed with brides being made up like geishas. But despite the go-go glitter, each wedding—like everything related to romance and marriage—was conducted by traditional Afghan rules. Salons were divided by walls or curtains into separate women’s and men’s sections. The newlyweds were virtual strangers, their match arranged between families and their courtship limited to tightly chaperoned visits. After the ceremony, the bride was expected to move in with her husband’s family, for life. By religious law, he could divorce her at will, or marry up to three additional women. She had almost no rights at all. Even if she were abused or abandoned, it was considered a deep family shame if she sought a divorce, and a judge would admonish her to be more dutiful and reconcile. On some levels, the departure of the Taliban brought new freedom and opportunity to women. Teachers and secretaries and hairdressers could return to work, girls could enroll in school again, and housewives could shop unveiled without risk of a beating from the religious police. In cities, fashionable women began wearing loose but smart black outfits with chic pumps. Women served as delegates to both Loya Jerga assemblies, the new constitution set aside parliamentary seats for women, and a female pediatrician in Kabul announced her candidacy for president. But when it came to personal and sexual matters, political emancipation had no impact on a conservative Muslim society, where even educated urban girls did not expect to date or choose their mates. In Kabul, I became close friends with three women—a doctor, a teacher and a nurse—all articulate professionals who earned a good portion of their families’ income. Over three years, I knew them first as single, then engaged and finally married to grooms chosen by their families. My three friends, chatty and opinionated about politics, were far too shy and embarrassed to talk with me about sex and marriage. When I delicately tried to ask how they felt about having someone else choose their spouse, or if they had any questions about their wedding night—I was 100 percent certain none had ever kissed a man—they blushed and shook their heads. “I don’t want to choose. That is not our tradition,” the nurse told me firmly. Village life was even more impervious to change, with women rarely allowed to leave their family compounds. Many communities forced girls to leave school once they reached puberty, after which all contact with unrelated males was prohibited. During one visit to a village in the Shomali Plain, I met a woman with two daughters who had spent the Taliban years as refugees in Pakistan and recently moved home. The older girl, a bright 14-year-old, had completed sixth grade in Kabul, but now her world had shrunk to a farmyard with chickens to feed. I asked her if she missed class, and she nodded miserably. “If we left her in school, it would bring shame on us,” the mother said with a sigh. For a western woman like me, life in Kabul grew increasingly comfortable. 
As the number of foreigners increased, I drew fewer stares and began to wear jeans with my blousy tunics. There were invitations to diplomatic and social functions, and for the first time since the end of Communist rule in 1992, liquor became easily available. Yet despite the more relaxed atmosphere, Kabul was still no place for the pampered or faint of heart. My house was in an affluent district, but often there was no hot water, and sometimes no water at all; I took countless bucket baths on shivering mornings with tepid water from the city tap. Urban dust entered every crack, covered every surface with a fine gritty layer, turned my hair to straw and my skin to parchment. Just outside my door was a fetid obstacle course of drainage ditches and rarely collected garbage, which made walking a hazard and jogging out of the question. Electricity was weak and erratic, although the municipal authorities set up a rationing system so residents could plan ahead; I regularly set my alarm for 5 a.m. so I could wash clothes before the 6 a.m. power cut. I became so accustomed to dim light that when I finally returned to the United States, I was shocked by how bright the rooms seemed. For all the stories I covered and the friends I made, what gave real meaning and purpose to my years in Kabul was something else entirely. I had always been an animal lover, and the city was full of emaciated, sickly stray dogs and cats. One by one they found their way into my house, and within a year it was functioning as a shelter. There were no small animal veterinary services—indeed, no culture of pets, unless one counted fighting dogs and roosters—so I treated the animals with pharmacy drugs and patient observation, and almost all of them bounced back. Mr. Stumpy, a mangy cat whose hind leg had been crushed by a taxi and then amputated, hopped around the sun porch. Pak, a sturdy pup whose mother had been poisoned to death, buried bones in my backyard. Pshak Nau, a wild cat who lived over the garage, was gradually lured by canned tuna into domesticity. Honey, a pretty dog I bought for $10 from a man who was strangling her, refused to leave my side for days. Se Pai, a black kitten who was scavenging garbage on three legs, became a contented parlor cat after a terrible wound on his fourth leg healed. One freezing night I found a dog so starved she could no longer walk, and I had to carry her home. I had no space left by then, but an Afghan acquaintance, an eccentric mathematician named Siddiq Afghan, said she was welcome to stay in his yard if she could reach accommodation with his flock of sheep. For an entire winter, I brought Dosty food twice a day, while she eyed the sheep and put on weight. My happiest hours in Afghanistan were spent nursing these animals back to health, and my proudest accomplishment was opening a real animal shelter in a run-down house, which I refurbished and stocked and staffed so it would continue after I left. I also brought some of the animals back with me to America, a complicated and expensive ordeal in itself. Mr. Stumpy landed on a farm in Vermont, where his new owners soon sent me a photograph of an unrecognizably sleek, white creature. Dosty found a permanent home with a couple in Maryland, where she was last reported leaping halfway up oak trees to protect my friends from marauding squirrels. Pak, at this writing, is gnawing on an enormous bone in my backyard in Virginia. 
Although I grew attached to Kabul, it was in the countryside that I experienced true generosity from people who had survived drought and war, hunger and disease. On a dozen trips, I forced myself to swallow greasy stews offered around a common pot—with bread serving as the only utensil—by families who could ill afford an extra guest. And in remote villages, I met teachers who had neither chalk nor chairs nor texts, but who had devised ingenious ways to impart knowledge. Over three years, I ventured into perhaps 20 provinces, usually in hasty pursuit of bad news. In Baghlan, where an earthquake toppled an entire village, I listened with my eyes closed to the sounds of a man digging and a woman wailing. In Oruzgan, where a U.S. gunship mistakenly bombed a wedding party, killing several dozen women and children, I contemplated a jumble of small plastic sandals left unclaimed at the entrance. In Logar, a weeping teacher showed me a two-room schoolhouse for girls that had been torched at midnight. In Paktia, a dignified policeman twisted himself into a pretzel to show me how he had been abused in U.S. military custody. During a trip to Nangarhar in the eastern part of the country, I was invited on a rollicking and uplifting adventure: a three-day field mission with U.S. military doctors and veterinarians. We straddled sheep to squirt deworming goo in their mouths, watched baby goats being born, and held stepladders so the vets could climb up to examine camels. We also glimpsed the brutal lives of Afghan nomads, who lived in filthy tents and traveled ancient grazing routes. A crippled girl was brought to us on a donkey for treatment; children were given the first toothbrushes they had ever seen; mothers asked for advice on how to stop having so many babies. By the time we were finished, hundreds of people were a little healthier and 10,000 animals had been vaccinated. I also made numerous trips to poppy-growing areas, where the pretty but noxious crop, once nearly wiped out by the Taliban, made such a vigorous comeback that by late 2003 it accounted for more than half of Afghanistan’s gross domestic product and yielded as much as 75 percent of the world’s heroin. Drug trafficking began to spread as well, and U.N. experts warned that Afghanistan was in danger of becoming a “narco-state” like Colombia. Along roads in Nangarhar and Helmand provinces, fields of emerald poppy shoots stretched in both directions. Children squatted busily along the rows, weeding the precious crop with small scythes. Village leaders showed me their hidden stores of poppy seeds, and illiterate farmers, sweating behind ox teams, paused to explain precisely why it made economic sense for them to plow under their wheat fields for a narcotic crop. In March 2004, visiting a village in Helmand, I stopped to photograph a poppy field in scarlet blossom. A small girl in a bright blue dress ran up to my driver, beseeching him to appeal to me: “Please don’t destroy our poppies,” she said to him. “My uncle is getting married next month.” She could not have been older than 8, but she already knew that her family’s economic future—even its ability to pay for a wedding—depended on a crop that foreigners like me wanted to take away. It was in Helmand also that I met Khair Mahmad, a toothless and partly deaf old man who had turned a corner of his simple stone house into a sanctuary of knowledge. The high school where he taught had been bombed years before and was still open to the sky; classes were held in U.N. tents. 
Mahmad invited us home to lunch, but we were pressed for time and declined. Then, a few miles on our way back to Kabul, our vehicle had a flat tire and we limped back to the area’s only gas station, which turned out to be near Mahmad’s house. When we arrived, his family was eating a lunch of potatoes and eggs on the patio, and the old man leapt up to make room for us. Then he asked, a bit shyly, if we would like to see his study. I was impatient to leave, but assented out of courtesy. He led us up some stairs to a small room that seemed to glow with light. Every wall was covered with poems, Koranic verses and colored drawings of plants and animals. “Possessions are temporary but education is forever,” read one Islamic saying. Mahmad had perhaps a ninth-grade education, but he was the most knowledgeable man in his village, and for him teaching was a sacred responsibility. I felt humbled to have met him, and grateful for the flat tire that had led me to his secret shrine. It was at such moments that I remembered why I was a journalist and why I had come to Afghanistan. It was in such places that I felt hope for the country’s future, despite the bleak statistics, the unaddressed human rights abuses, the seething ethnic rivalries, the widening cancer of corruption and drugs, and the looming struggle between the nation’s conservative Islamic soul and its compelling push to modernize. When election day finally arrived, international attention focused on allegations of fraud at the polls, threats of Taliban sabotage and opposition sniping at Karzai’s advantages. In the end, as had been widely predicted, the president won handily over 17 rivals about whom most voters knew almost nothing. But on a deeper level, many Afghans who cast their ballots were not voting for an individual. They were voting for the right to choose their leaders, and for a system where men with guns did not decide their fates. I had read all the dire reports; I knew things could still fall apart. Although the election was remarkably free of violence, a number of terrorist bombings and kidnappings struck the capital in the weeks that followed. But as I completed my tour of duty and prepared to return to the world of hot water and bright lights, smooth roads and electronic voting booths, I preferred to think of that chilly village schoolhouse and the face of that young farmer, poking a ballot into a plastic box and smiling to himself as he strode out of the room, wrapping his shawl a little tighter against the cold autumn wind.
de075a2670d407d3d679a6c456a9838c
https://www.smithsonianmag.com/history/atlanta-famed-cyclorama-tell-truth-civil-war-once-again-180970715/
When I was a little boy growing up in South Carolina, my mom decided to take me and a neighborhood girl on a big history trip and visit the sights in Atlanta. Emphasis on the big. We saw Stone Mountain, the half-finished Confederate rival of Mount Rushmore. And at some point I recall clicking through the turnstile of a massive building at the Atlanta Zoo to see something amazing: “the largest painting in the world.” I wish I could remember anything other than that everything felt dank in there, like a long-unvisited cellar, but the thing was, as promised, insanely big. It was called the Cyclorama, and the canvas was suspended around the 360 degrees of a high circular wall, showing hundreds of clashing soldiers. If I had listened to the guide, I might have heard that here was a great Confederate victory in the Civil War, depicted in images almost three stories high and more than a football field long. And I would have learned of its mysterious origin—how in the 1890s, a circus came to town with this spectacular visual entertainment and some exotic animals. But the circus went bankrupt, and everything that I was looking at—this big canvas and all the animals—had washed up here, in Atlanta’s Grant Park. All of that is an exaggeration, of course. It’s not the largest painting in the world, although it’s up there; and while it’s huge, those dimensions are mostly hyped. The painting depicts the Battle of Atlanta, a decisive Union victory in 1864. And the story of the Cyclorama’s journey is no carnival tale but more a Homeric odyssey for a canvas that got touched up and repainted as it got kicked farther and farther south until it was marooned in the Atlanta Zoo. To gaze upon the painting today—restored, reinstalled and reopening in February at the Atlanta History Center—is to see an unintended monument to the wonderments of accretion: accretions not merely of paint, but of mythmaking, distortion, error, misinterpretation, politics, opportunism, crowd-pleasing, revisionism, marketing, propaganda and cover-up (literally). Only a few years ago, the attraction seemed done for. Attendance was down to stragglers, and the city was hemorrhaging money. The future of the big canvas seemed to be a storage bin somewhere and, after some time, the dustbin. But then a few folks in Atlanta realized that restoring the painting would not only resurrect one of the more curious visual illusions of the 1880s, but also show, in the paint in front of your eyes, a neat timeline of the many shifts in Southern history since Appomattox. This was no mere cyclorama. What the saviors had on their hands was, ladies and gentlemen, the largest palimpsest of Civil War memory to be found anywhere on planet Earth—the Atlanta Cyclorama, one of the great wonders of the postmodern world. Cycloramas were a big popular entertainment once upon a time, and the way they worked was this: Once you entered the big building, you would typically climb a staircase to a platform located in the dead center of a painting that completely encircled you. The canvas was slightly bowed away from the wall, and the horizon line of the painting’s action was at the viewer’s eye level. As much as a third of the top of the painting was sky, painted progressively darker toward the top to create a sense of distance extending away. 
And the bottom of the canvas would often be packed up against a flooring of dirt with real bushes and maybe guns or campsites, all part of a ground-floor diorama that, in the limited lighting, caused the imagery in the painting to pop in the viewer’s mind as a kind of all-enveloping 3-D sensation. “It was the virtual reality of its day,” Gordon Jones, the curator at the Atlanta History Center, told me. The effect was like walking inside one of those stereoscopes, the early View-Masters of that time, that tricked the eye into perceiving space and distance. Standing on that platform was like sinking into this slight illusory sense—in this case, that you were a commander on a hill taking in the battle at hand. Beginning in the 1880s, these completely circular paintings started appearing from half a dozen companies, such as the American Panorama Company in Milwaukee, where Atlanta’s canvas was conceived. APC employed more than a dozen German painters, led by a Leipzig native named Friedrich Heine. Cycloramas could depict any great moment in history, but, for a few years in the 1880s, the timing was just right for Civil War battle scenes. A single generation had passed since the end of the Civil War, and survivors everywhere were beginning to ask the older family members: What happened in the war? These giant paintings constituted the first time anyone in America encountered a sensation far more immersive than a magazine illustration or a Mathew Brady photograph—the illusion of seeing a full reality, the grand overview, viewed from on high—the big picture. In the heyday of this new medium, one might gain admission to see the Battle of Gettysburg, the Storming of Missionary Ridge and the Battle Above the Clouds, or the Merrimac and Monitor Naval Battle. For a change of pace, maybe you’d catch Custer’s Last Stand, the Great Chicago Fire or Christ’s Triumphal Entrance to Jerusalem. The Battle of Atlanta Cyclorama was significant because it captured this one moment of the Civil War when everything changed. That midsummer of the war’s fourth year, Northern voters were losing interest, Lincoln’s popularity was sinking, an election was coming up and all news from the battlefields had been bad. Then, in an instant, momentum turned around. Atlanta was defeated, and afterward, Gen. William Tecumseh Sherman turned east for the long march that ended the war. But this battle almost went the other way, especially at one key moment—4:45 p.m. on July 22, 1864. On the railroad line just outside Atlanta, near a place called the Troup Hurt House, the Union Army had set up a trench line with artillery commanded by Capt. Francis DeGress. Rebels broke that line and were heading to take on the Yankee troops until Gen. John “Black Jack” Logan counterattacked and pushed the Confederates back. “If you are going to have a battle scene, you don’t paint a walkover, right?” explained Jones. “You don’t make it a 42-0 rout. There’s no glory in that. There’s glory when you win by a point with a field goal in the last second of overtime. So, this is that moment.” The Battle of Atlanta Cyclorama opened in Minneapolis to a Northern audience in the summer of 1886. A few weeks later, a local newspaper reported that General Sherman declared it to be “the best picture of a battle on exhibition in this country.” Part of its allure was not just the cognitive effect of a 3-D sensation, but also the accuracy of detail. 
The Milwaukee Germans interviewed lots of Union veterans, traveled to Atlanta to sketch locations and spoke to Confederates. In the studio, helping out, was Theodore Davis, war illustrator for Harper’s Weekly, who was on the field that July 22. (The Germans thanked Davis by painting him on horseback just behind a covered-wagon ambulance.) The pinpoint accuracies on the canvas were impressive—the weaponry on the field, the uniforms by rank and even details down to the sleigh-like cut of an artillery driver’s saddle. For the vets, there were specific commanders visible among the vast battle confusion, recognizable on the canvas: Gen. James Morgan, Gen. Joseph Lightburn and Gen. James McPherson, lying in the covered-wagon ambulance where he would die of his wounds. General Sherman can be spotted on a far hill, overseeing the maneuvers, but the biggest, most recognizable figure is Gen. Black Jack Logan. The painters of the day made him huge because they knew who they were painting for, which is also why there are no recognizable Confederates in the painting. And in Minnesota, where the tour of the painting began, they knew Logan would draw the crowds. “He had star power,” Jones said. As a postwar civilian, Logan would become even more prominent, eventually picked by James Blaine in 1884 as his vice presidential nominee. But most important, in the North, soldiers loved him. “They knew Logan,” Jones said. “He was right up there. If he’s not Jesus or Moses, he’s Abraham.” The Cyclorama was a big moneymaker. Crowds packed the rotundas to see a battle, and veterans were full of pride to point out to family members “where I was.” Politicians spotted a media opportunity. The Republican nominee in 1888 was Indiana’s Benjamin Harrison, and although he had not fought in the Battle of Atlanta, he had been a few miles away a few days before. So, as the painting was prepped to travel to Indianapolis, some bright Harrison campaign operative convinced the manager of the Cyclorama to paint over the figure of Harper’s Weekly illustrator Theodore Davis on the battlefield, and make him into Gen. Benjamin Harrison. Soon enough, the Indiana papers encouraged customers to see the new Cyclorama, which suddenly appeared to have a new name. “HARRISON AT ATLANTA,” the ads screamed. Harrison lost the popular vote that November, but in the electoral college, he won—thanks in part to the votes in Indiana and neighboring states. When the Harrison touch-up was exposed in the press, the revelation was an embarrassment for everyone. Stolen valor was a thing then, too. But it was the Cyclorama manager who suffered most. He resigned in disgrace, while Harrison would stay mounted upon that horse for more than a century. Jones recently put Theodore Davis back in his saddle, his rightful place documented in very early photos of the original image. “The hierarchy of our thinking,” Jones said, “is to restore the illusion intended by the artist.” But throughout the canvas, Jones added, there are “exceptions, too”—changes that tell other stories, and they will stay. No one thinks of the late 19th century as a frantic time of new media, but by 1890, magic lantern shows were popular and the big leap in virtual reality, movies, was only a few years off. So after only a couple of years of popularity, the easy money in cycloramas had been made; it was time for the smart investors to sell off while the getting was good. The Battle of Atlanta went on the block that year and sold to a Georgian named Paul Atkinson. 
He was a semi-successful barker, a poor man’s P.T. Barnum. The youngest brother of four Confederate soldiers, Atkinson was known for managing the career of his wife, Lulu Hurst. She performed alleged feats of strength onstage, claiming that she gained her superpowers after a mysterious encounter with an electrical storm—performing under names like “The Magnetical Electrical Georgia Girl” and “The Amazing Wonder of the Nineteenth Century.” As Atkinson prepared the painting for its next move—to Chattanooga in 1891—he saw that there was something sublimely marketable in the moment the Germans chose to paint. The Southern surge, and the Union counterattack—the battle really was this perfect Schrödinger moment when the South was not yet the loser and the Union not yet the winner. Atkinson’s heyday as a promoter was also when the South’s attempted rewrites of the war began to solidify into the first chapter of what we now call the Lost Cause. Slavery might have been the only cause discussed and written about before the war, but down South, that claim had long ago been talked out of the story. Now, the war was about principles of states’ rights and self-determination, but mostly it was about honor. Gen. Robert E. Lee’s shortcomings as a general and a slave owner were neatly marginalized in veterans’ magazines and memorial speeches. The Union generals all had makeovers as monsters—Benjamin Butler, the Beast; William Sherman, the Butcher. Meanwhile, Confederate leaders had all been airbrushed into high-minded men with chiseled profiles. The focus was now on brilliant military strategy, revealing a scrappy Confederacy fighting with fewer resources but fighting with honor. So Atkinson saw a problem with his new acquisition. Because the painting had been done originally for Northern vets, there were a few images that were obviously meant to tip the meaning of the entirety of the canvas. And there was one image in particular that would not jibe with the new Lost Cause view of things. It was that scene, just off from the counterattack, where one could see some Rebels in gray being taken prisoner. And in the hand of one of the Union soldiers was a humbled Confederate flag. POWs, a captured flag—these are the emblems of weakness and dishonor. So, with some touches of blue paint, Atkinson turned a cowering band of Johnny Rebs into a pack of cowardly Billy Yanks, all running away from the fight. By the time the painting was moved to Atlanta in 1892, the newspaper made it even easier for everybody, announcing the arrival of the new Cyclorama and its depiction of the “only Confederate victory ever painted!” Still, ticket sales were tepid. Atkinson offloaded his mistake to one Atlanta investor who then pawned it off to another; in 1893, the painting was sold for a mere $937. Around the country, the cyclorama fad was over. As the years passed, the Battle of Atlanta suffered. Roof timbers in one location crashed through and damaged the painting, and when it was finally moved to Grant Park in 1893, it sat outside in the weather for four weeks before being moved into the new building. And when they finally hung the thing, it was discovered the site was too small, so the new owners razored a sizable vertical chunk out of the decaying canvas to make it fit. The decline of interest in battlefield specifics also segued easily into the latest shift in Lost Cause emphasis. 
After the collapse of Reconstruction, the two sides of the war finally did heal into a single nation, but the new union was forged by a common embrace of white supremacy. Jim Crow laws were passed in the South and segregation became the accepted way, from Maine to Florida and straight across to California. Every surge of resistance from black Americans was met with a counterassault of grotesque violence. Beginning roughly in 1890, an African-American was lynched, burned alive, or mutilated every week for the next 50 years. The rearrangement of a nation founded on the idea of equality into a country with a permanent second class meant re-domesticating the slaveholding planter philosophy of how things should be. Blacks would be relegated to a segregated economy, but this time, a more folksy sense of supremacy was also promulgated, a kind of Southern lifestyle every region of America could enjoy. The popularization of the Confederate rectangular Navy Jack flag would serve to rebrand the South as this distinctive place, home of a new easygoing racism. Now, everyone could have an Aunt Jemima cook them pancakes in the morning, and faithful retainer Uncle Ben serve the converted rice at dinner. They were right there on the boxes at the local grocery, available for purchase. This new story also meant reshaping the forced-labor camp of cotton production into the romantic splendor of the plantation mansion, rebuilt as a magnolia Arcadia of neo-Georgian architecture (a lovely wedding-event destination, available for rental). No media event was more responsible for cementing these new factoids into the minds of Americans than Gone With the Wind—a 1939 movie that distills the South into a cozy racial lifestyle while utterly marginalizing the Civil War. In the movie’s four-hour running time, there is not a single battle scene. The technical adviser largely responsible for the entire look and feel of that movie was Wilbur Kurtz, an Illinois-born painter who moved to Atlanta as a young man. He married the daughter of a railroad officer who worked with the Confederacy during the war.* Like so many eager transplants, Kurtz became more Southern than any other Southerner. And in those years before Gone With the Wind was released, during the 1930s, the city of Atlanta asked Wilbur Kurtz to restore the dilapidated Cyclorama. Kurtz was known as a newspaper illustrator and often drew pictures for popular books of the time. A typical Kurtz illustration, for a book called Maum Nancy, shows an old white man seated as his liveried maid presents his libation on a silver platter. “There stood Nancy, carrying a tall glass of mint julep,” the caption reads. So, for Kurtz, restoring the Cyclorama also meant brightening things up here and there. In the canvas, for reasons lost to history, there had been a few flags showing St. Andrew’s Cross, the red cross on the white field that eventually became the state flag of Alabama. Kurtz overpainted them with the new signifier of Southern heritage—the rectangular Navy Jack of the Confederate States. By the end, he added 15 of the Navy Jack flags, and painted in nearly a dozen new Confederate soldiers. And there was a kind of Hollywoodification by way of Kurtz, too. He recruited NBC radio announcer John Fulton to read a script over a sound system. 
The tour of the Cyclorama now began with a triumphant recording of “Dixie.” When Clark Gable and Vivien Leigh came to Atlanta for the premiere of the movie (no black actors allowed, of course), the celebrities visited Kurtz’s Cyclorama. Gable, according to legend, told Kurtz he loved everything about the big painting except one thing: “I am not in it.” Soon enough, Kurtz had one of the floor mannequins recast to look exactly like a fallen Rhett Butler. These direct plays to the audience may have worked for a while, but the appeal was short-lived. Interest in the war itself, the movements of troops, surges and countersurges, the Battle of Atlanta, had long ago become the province of Civil War aficionados, those guys who buy all the new Civil War books and schedule their social lives around the next re-enactment. But the general public was more interested in the zoo. Then, a funny thing happened on the way to the future: the Voting Rights Act. By the early 1970s, certain city council members were pushing to have the Battle of Atlanta, properly understood as a Confederate victory, taken to Stone Mountain to become part of the neo-Confederate relic jamboree hosted there. But by then, the mayor of Atlanta was Maynard Jackson, the first African-American to hold that office, and he had an “Emperor Has No Clothes” moment. Amid new legislation to relocate the canvas, he simply looked at the painting, saw what it was, and said so out loud. “The Cyclorama depicts the Battle of Atlanta, a battle that the right side won,” he explained in 1979, “a battle that helped free my ancestors.” Jackson added, “I’ll make sure that that depiction is saved.” In the last year or so since the neo-Confederate violence in Charlottesville, Virginia, discussions around the country have centered on “recontextualizing” Confederate statues and memorials. One can easily argue that Maynard Jackson was the first politician to make this case because, with his intervention, the Cyclorama was saved, this time with a new script for the sound system, voiced by James Earl Jones. * * * By 2011, though, the Cyclorama was again in shabby condition, a moth-eaten relic that a new mayor wanted to trash. “He put it on his list of city-owned assets that he viewed as white elephants,” said Sheffield Hale, who chaired the committee to decide how to dispose of things like the Cyclorama. Downtown was now host to all kinds of buzzy attractions invoking the New Atlanta—the College Football Hall of Fame, the World of Coca-Cola, the Center for Civil and Human Rights. There were recommendations to hang the old canvas near Underground Atlanta, the shopping district, or maybe finally put it in that storage bin, wait a few decades, and throw it away. That story hit the Atlanta Constitution on a Sunday in 2013 and one of the city’s most successful real estate moguls, Lloyd Whitaker, was reading the paper just before heading off to church. In fact, his firm, Newleaf, is typically described as a real estate turnaround company, and in that sense he saw the Cyclorama as something different—an object that drew a line from the new-media dreams of those German painters to Mayor Jackson’s epiphany. “The Battle of Atlanta was the death knell of the Confederacy,” Whitaker told an Atlanta blogger. 
“We are going to be able to preserve that in the literal sense with the painting, and symbolically with how that led to the civil rights movement.” Around that same time Hale took a job at the Atlanta History Center, located in the city’s affluent Buckhead district. Whitaker offered $10 million as a lead legacy and an incentive to raise even more money. Hale recognized right away how a new context for a cheesy 1880s spectacle could be created. “This was not an attraction,” Gordon Jones, the History Center curator, told me, “this was an artifact.” “We ended up raising $25 million more to construct the building, restore the painting and do the exhibits,” Hale said. “We had the ability to really deal with the history of the painting and the Lost Cause and all that is wrapped up in the irony of the painting—and turn it into a different object.” Hale and Jones are restoring the painting according to the documentary history recorded by the German artists in 1886. They want to recapture the original optical effect as well, with attention to scale and lighting. But they are also filling back in elements snipped out, painted over or otherwise altered over the years. Those Confederate captives, reimagined as fleeing Unionists by Atkinson, will again be shown as prisoners. And another image added by Atkinson, that of a Union flag ground into the mud, will be expunged. The story of those changing nuances in oil is presented as a narrative in two mediums. From the observation stage, a late 19th-century visual spectacle, restored to its full impact, offers an immersive experience of a pivotal battle. Down below, beneath the viewing level, extensive wall text lays out a detailed account of how the painting was revised to reflect mutable interpretations of the past. Even the story of how the History Center moved the painting from its previous location signaled its new status as a highly symbolic relic. Strips of stabilizing canvas were adhered to the back of the 42-foot-high deteriorating canvas. In order to extract it from the domed building at the zoo, the painting had to be cut in half and rolled into two separate vertical pillars. A crane lifted each pillar straight out of a seven-foot-diameter hole cut into the rotunda—a Brobdingnagian illuminated manuscript revealing the changing history of Southern identity. Once the painting had been transported and unfurled, restorationists could begin their work. Fittingly, the winning bid went to a firm from Germany—Weilhammer & Schoeller. Uli Weilhammer showed me around the hall where a half-dozen artists, standing in pulley-suspended lifts, applied their skills. “You can’t put this painting on a table and work on it,” he said. He pointed out a depiction of a seemingly misshapen soldier at the bottom of the canvas and then walked up the stairs to the viewing platform. “As a conservator, you have to adjust for the distance,” he said. “You are painting on curved canvas.” He gestured below. “Look at some of these figures, close up they are quite distorted, they work only from here, from this perspective.” Weilhammer indicated where that 56-inch-wide chunk of painting had been sliced out of the canvas a century ago. The section will be replaced, the painted sequence based on photographs of the original. Seven feet of new canvas, showing painstakingly recreated blue sky, has been added to return the panorama to its original dimensions. A high receding skyline is crucial to make the slightly bowed foreground feel like a three-dimensional landscape. 
This new, reconceived Cyclorama is a monumental pageant that took a slow-motion flash mob of painters, politicians, promoters, propagandists and restorationists 140 years to complete—a multilayered artifact that tells the episodic tale of the Old South’s evolution. It now measures 371.2 feet long and 49 feet high and weighs 9,400 pounds—no exaggeration.
Editor's Note, February 28, 2019: An earlier version of this story misstated the occupation of Wilbur Kurtz’s father-in-law. He was not officially a Confederate officer, but worked with the Confederacy during the Civil War.
This article is a selection from the December issue of Smithsonian magazine.
d34e813a1a7c23cec4cbb7c44ca5ed1a
https://www.smithsonianmag.com/history/attempted-assassination-andrew-jackson-180962526/
The Attempted Assassination of Andrew Jackson
The Attempted Assassination of Andrew Jackson On January 30, 1835, politicians gathered in the Capitol Building for the funeral of South Carolina Representative Warren Davis. It was a dreary, misty day and onlookers observed that it was one of the rare occasions that could bring the fiercest of political rivals side by side on peaceable terms. But the peace wasn’t meant to last. President Andrew Jackson was among their number that day. At 67, Jackson had survived more than his fair share of maladies and mishaps—some of them self-provoked, such as the bullet lodged in his chest from a duel 30 years earlier. “General Jackson is extremely tall and thin, with a slight stoop, betokening more weakness than naturally belongs to his years,” wrote Harriet Martineau, a British social theorist, in her contemporaneous travelogue Retrospect of Western Travel. Six years into his presidency, Jackson had used bluster and fiery speeches to garner support for his emergent Democratic coalition. He used his veto power far more often than previous presidents, obstructing Congressional action and making political enemies in the process. Jackson’s apparent infirmity at the funeral belied his famous spitfire personality, which would shortly be on full display. As Jackson exited the East Portico at the end of the funeral, Richard Lawrence, an unemployed painter, accosted him. Lawrence pulled a Derringer pistol from his jacket, aimed at Jackson, and fired. The cap fired, but the bullet failed to discharge. As Lawrence withdrew a second pistol, Jackson charged his would-be assassin. “Let me alone! Let me alone!” he shouted. “I know where this came from.” He then attempted to batter the attacker with his cane. Lawrence fired his second gun—but this one, too, misfired. Within moments, Navy Lieutenant Thomas Gedney and Tennessee congressman Davy Crockett had subdued Lawrence and hurried the president off to a carriage so he could be transported to the White House. When Lawrence’s two pistols were later examined, both were found to be properly loaded and well functioning. They “fired afterwards without fail, carrying their bullets true and driving them through inch boards at thirty feet,” said U.S. Senator Thomas Hart Benton. An arms expert later calculated that the likelihood of both pistols misfiring was 125,000 to 1. It was the first attempt to assassinate a sitting president, and in the aftermath, attention was focused less on how to keep the President safe and more on the flinging of wild accusations. Jackson himself was convinced the attack was politically motivated, and charged rival politician George Poindexter with hiring Lawrence. No evidence was ever found of this, and Poindexter was cleared of all wrongdoing. “Before two hours were over, the name of almost every eminent politician was mixed up with that of the poor maniac who caused the uproar,” Martineau, who was at the Capitol building during the attack, wrote. Later that evening, she attended a party with the defiant president. “[Jackson] protested, in the presence of many strangers, that there was no insanity in the case,” Martineau observed. “I was silent, of course. He protested that there was a plot, and that the man was a tool, and at length quoted the Attorney-General as his authority. It was painful to hear a Chief Ruler publicly trying to persuade a foreigner that any of his constituents hated him to death: and I took the liberty of changing the subject as soon as I could.” Indeed, Lawrence’s insanity was fairly obvious. 
Not only did the painter believe the president had killed his father; he was also convinced that he was the 15th-century English king Richard III, that he was entitled to payments from his American colonies, and that Jackson had prevented him from receiving that money by opposing the recharter of the Second Bank of the United States. At the trial in April 1835, with attorney Francis Scott Key prosecuting, Lawrence announced to the jurors, “It is for me, gentlemen, to pass upon you, and not you upon me.” He was found not guilty by reason of insanity and confined to a hospital for the mentally ill until his death in 1861. But Jackson had good reason to think he had raised the ire of fellow politicians. “Jackson was ill-tempered, a fierce hater, unbending, dictatorial and vindictive,” writes Mel Ayton in Plotting to Kill the President. And one of Lawrence’s stated motives for the attack—Jackson’s opposition to the Second Bank of the U.S.—was a real source of political antagonism. In the years before the assassination attempt, Jackson came out swinging against the Bank of the United States (BUS). The chartered corporation was the second of its kind (the first was chartered in 1791 as the brainchild of Alexander Hamilton). When Congress allowed the charter on the first bank to expire in 1811, lawmakers quickly discovered how important a function it served: It issued currency, opened branches throughout the country, brokered loans if the U.S. needed to borrow money and moved money between banks. So in 1816, Congress passed a new 20-year charter for the bank. “In the period of the 1820s, most observers thought the bank behaved responsibly. It served the government well and kept out of politics,” says historian Daniel Feller, editor of the Papers of Andrew Jackson. “In 1829, Jackson attacked the banks and that kind of startled everybody. He said it represented a dangerous concentration of power.” Jackson thought the bank represented the dangers of the wealthy aristocracy occupying a place of privilege in government that wasn’t accessible to average Americans. “[He] said, ‘It is to be regretted that the rich and powerful too often bend the acts of government to their selfish purposes.’ That’s his broader philosophical objection to the bank,” Feller says. In 1832, Congress passed a bill to preemptively re-charter the BUS. Jackson vetoed it, though the bank would remain in place for another four years. The veto became a major campaign issue when Jackson ran for reelection that year. His opponent, Henry Clay, believed the national bank allowed the federal government to manage the wellbeing of the country’s economy. Empowered by an overwhelming electoral victory over Clay, Jackson decided to remove federal deposits (money that came from customs officers collecting revenue in ports and other government funds) and deposit them in state-chartered banks, a move that made it impossible for the bank to regulate the country’s currency. It also further provoked Congress, whose members saw it as a huge overreach of executive power. In response, the Senate censured Jackson in 1834 for “assuming power not conferred by the Constitution.” It was the first—and only—time the Senate ever censured a president. The back-and-forth battle became known as the Bank War. It transfixed the country, to the point where even someone with clear mental instability could easily reference it in his assassination attempt. In the end, Jackson won his war. 
The charter for the Second Bank expired in 1836, and the federal funds the president had diverted to state banks remained in their scattered locations. As for security around the White House and the Capitol, it remained much as it had been for the duration of Jackson’s term. Visitors were still allowed entry to the White House without any particular screening process. It would be another 26 years before another U.S. president, Abraham Lincoln, was targeted for assassination, but a watchful security team thwarted the conspiracy. Four years later, they would not be so lucky. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
da378d5ad765ef0fd334b057d0fef5bf
https://www.smithsonianmag.com/history/axeman-new-orleans-preyed-italian-immigrants-180968037/
The Axeman of New Orleans Preyed on Italian Immigrants
The Axeman of New Orleans Preyed on Italian Immigrants By August of 1918, the city of New Orleans was paralyzed by fear. In the dead of the night, the Axeman of New Orleans (as he came to be known) broke into a series of Italian groceries, attacking the grocers and their families. Some he left wounded; four people he left dead. The attacks were vicious. Joseph Maggio, for example, had his skull fractured with his own axe and his throat cut with a razor. His wife, Catherine, also had her throat cut; she asphyxiated on her own blood as she bled out. Several lethal attacks that didn’t target Italians were also thought to be the work of the Axeman, although this would later prove not to be the case. Nevertheless, New Orleanians were terrified. Only thirty years after Jack the Ripper stalked the streets of Whitechapel, the Axeman of New Orleans held an American city hostage. The press noted that the Italian immigrant community was especially fearful, with panic-stricken men staying up all night to guard their families. New Orleans Superintendent of Police Frank Mooney suspected that the murderer was a “murderous degenerate … who gloats over blood.” The Axeman struck households in New Orleans from 1917 to March 1919. Then the killer crossed the Mississippi River to the neighboring town of Gretna. On the night of March 9, he assaulted the Cortimiglia family in the familiar fashion, badly injuring Charlie Cortimiglia and his wife, Rosie, and killing their two-year-old daughter. Mooney believed this was the work of his “degenerate.” The Gretna authorities – Police Chief Peter Leson and Sheriff Louis Marrero – however, settled on the Cortimiglias’ next-door neighbors, elderly Iorlando Jordano and his 17-year-old son Frank, as the culprits. As grocers, they were business competitors of the Cortimiglias and had recently taken them to court over a business dispute. The trouble was that no evidence implicated the Jordanos. The officials handled this inconvenience by haranguing the injured Cortimiglias as they lay in Charity Hospital, asking repeatedly, “Who hit you?” “Was it the Jordanos? Frank did it, didn’t he?” According to the doctor who treated her, Rosie always said that she didn’t know who had attacked her. When she was well enough to be released, Marrero immediately arrested Rosie as a material witness and incarcerated her in the Gretna jail. She was released only after she signed an affidavit implicating her neighbors. When Iorlando and Frank went on trial for their lives, the only evidence against them was Rosie’s identification, an identification that even her own physician thought unreliable. Yet, after a trial of less than a week, they were both convicted of murder. Sixty-nine-year-old Iorlando was sentenced to life imprisonment; Frank was to hang. Nine months later, Rosie walked into the newspaper office of the Times-Picayune and retracted her testimony. She said that St. Joseph had come to her in a dream and told her she had to tell the truth. Rosie signed another affidavit, this time declaring that she hadn’t seen her attackers and had been pressured into identifying the Jordanos. Despite Rosie’s retraction, the prosecution didn’t immediately give up. At one point, Rosie was threatened with perjury charges if she didn’t stick to her original story. But finally, in December 1920, Iorlando and Frank walked free. Why were the Gretna authorities so quick to assume that neighbors, against whom there was no evidence, must have been the killers? 
Why were they so willing to ignore the advice of the New Orleans police superintendent, who had come to believe that there was a bloodthirsty fiend targeting Italian grocers? The Crescent City had known Italians from its earliest days, and an Italian business community established itself in the city well before the Civil War. These early arrivals hailed mostly from northern Italy, but it was the need for a cheap workforce in the late 19th century that led to the great influx of Sicilians into the state and the city and enticed men like Iorlando Jordano (Americanized from Guargliardo) to make the journey from Sicily to Louisiana. Sicilian laborers delighted the sugar planters of post-emancipation Louisiana, who found them, as one planter wrote, “a hard-working, money-saving race, and content with … few of the comforts of life.” By the 1880s and 1890s, Sicilians flooded into the port of New Orleans and dominated Italian immigration into Louisiana: over 80 percent of the Italian immigrants who arrived in New Orleans were Sicilian. Some stayed. By 1900, the city had the largest Italian community in the South; about 20,000 (counting the children of immigrants) lived in New Orleans. But most left to labor on the sugar cane and cotton plantations, an arduous life that nevertheless gave them the chance to save money. An immigrant who carefully hoarded his wages could strike out on his own within a few years. As far as the planters were concerned, this was the one problem with Italian workers. Planters grumbled that they couldn’t keep Italians in the field because in a couple of years they would have “laid by a little money and are ready to start a fruit shop or grocery store at some cross-roads town.” By 1900, small Italian-owned businesses had sprung up all over Louisiana. But the commercial success of Sicilian immigrants couldn’t protect them from the racial prejudices of the American South. Italians never entirely replaced black labor in Louisiana but worked alongside African-Americans in the fields. While Italians, not understanding the racial hierarchies of the South, found nothing shameful about this, for native whites their willingness to do so made them no better than “Negroes,” Chinese, or other “non-white” groups. The swarthy Sicilians were often considered not white at all, nothing but “black dagoes.” It wasn’t lost on a contemporary observer that even African-American laborers distinguished between whites and Italians and treated their fellow workers with, as he described it, “a sometimes contemptuous, sometimes friendly, first-name familiarity” they would never have dared employ with other whites. The notion that “dagoes” were no better than “Negroes” helps account for growing prejudice against Italian immigrants in the 1870s and 1880s. They faced suspicion and the occasional lynch mob. In 1929, a New Orleans judge gave voice to a common view, describing most Sicilians in New Orleans as “of a thoroughly undesirable character, being largely composed of the most vicious, ignorant, degraded and filthy paupers, with something more than an admixture of the criminal element.” In New Orleans, the French Quarter, the oldest section of the city, filled with decrepit Creole townhouses, had become the Italian neighborhood. 
By the early 20th century, so many Sicilians congregated in the lower French Quarter near the river that the area from Jackson Square to Esplanade Avenue, between Decatur and Chartres, was known as “Little Palermo.” One of the most common upward trajectories for an ambitious Sicilian in New Orleans and elsewhere was that from plantation worker to truck farmer and peddler to grocer. By the early 20th century, Italians were taking over the corner grocery business. They owned only 7 percent of grocery stores in New Orleans in 1880. By 1900, 19 percent were Italian-owned, and by 1920 they ran fully half of all groceries in the city. Some Italians did very well indeed in New Orleans: After laboring on the sugar cane plantations, Joseph Vaccaro peddled fruit from a mule-drawn cart. He later used a fruit stall in the New Orleans French Market to launch his wholesale business and eventually made his fortune importing oranges and bananas. Giuseppe Uddo began his career hawking olive oil and cheese from a horse-drawn cart before founding Progresso Food Products. Despite such successes, unpleasant stereotypes, some of which had a basis in reality, clung to Italian immigrants. The Sicilians brought with them to America a clannishness and distrust of the authorities that led them to settle their disputes the old-fashioned way: the vendetta. This system of justice survived in Sicily into the 20th century; immigrants brought it with them to New Orleans, and vendettas, both personal and professional, weren’t uncommon. So many shootings and knife fights occurred along Decatur Street that it was nicknamed “Vendetta Alley.” The fear of immigrant crime culminated in 1890-1891 with the murder of New Orleans Chief of Police David Hennessy. The popular official was met with a volley of shotgun fire as he arrived home on the night of October 15, 1890. Mortally wounded, Hennessy insisted, “The dagos got me.” He’d previously been involved in a violent dispute between two Italian factions, the Provenzanos and the Matrangas. New Orleanians found it easy to believe that Hennessy’s murder was connected to the feud and that organized Italian criminal gangs—which the press often referred to as “the Mafia”—were responsible. The police arrested a number of Sicilians, who were to be tried in two groups. After an initial set of acquittals, a mob stormed the jail, murdering 11 of the accused. They lynched some who’d been acquitted, as well as some who had yet to be tried. Criminal Italian gangs were certainly active in New Orleans, although as crime historian Humbert S. Nelli has pointed out, their criminal activity “could not accurately be ascribed to Mafiosi.” Historian Robert M. Lombardo has explained that “the Mafia was not a secret criminal organization but a form of social organization that developed in Sicily and the south of Italy under very specific circumstances.” It was, he notes, “a form of behavior and a kind of power, not a formal organization.” On the other hand, a type of petty extortion known as Black Hand crime—a practice rather than an organization—did exist; the victim was threatened with violence if the money demanded wasn’t paid. Such crime was ubiquitous in southern Italian communities all over the U.S. by the late 19th and early 20th centuries, including New Orleans, and only disappeared when the descendants of immigrants became sufficiently Americanized to complain to the police. 
Citizens of New Orleans tended to conflate the vendetta, the Mafia, and the Black Hand; by the early 20th century they were using “Mafia” and “Black Hand” interchangeably, both to refer to a formal criminal organization. Given this history, it wasn’t entirely surprising when New Orleanians suspected that the attacks on Italian grocers might be connected to a vendetta or Black Hand blackmail attempts. However, New Orleans detective John Dantonio, a nationally known expert on the “Mafia,” rejected the idea, saying that a Black Hand attack wouldn’t have left any survivors, as the Axeman frequently did. He agreed with Frank Mooney, New Orleans’ police superintendent, who was convinced that the attacks were the work of a “fiend,” “a Jekyll and Hyde personality, like Jack the Ripper. … [S]uddenly the impulse to kill comes upon him and he must obey it.” In other words, what we’d now call a serial killer. Despite Mooney and Dantonio’s view, when the Axeman attacked the Cortimiglias, the Gretna authorities could more easily accept a vendetta between two Italian businesses than they could the idea that a bloodthirsty “fiend” stalked the streets. Even some New Orleans police officers still believed that the vendetta could explain the Axeman murders. The Gretna officials had also had enough exposure to the Old World traditions of the Sicilian immigrants to have few qualms about manufacturing evidence against their “obvious” suspects; for this abuse of power no excuse exists. But for their ignorance of serial killers – at the time a novel concept – they cannot be blamed. And suspecting an Italian vendetta wasn’t entirely unreasonable in a period when disputes among Italian immigrants not infrequently resulted in assault or murder. A close examination of the attacks attributed to the Axeman shows that not all of these assaults were actually his handiwork. But someone was specifically targeting Italian grocers, both in 1917-1919 and in 1910-1911, when a similar spate of attacks occurred. According to eyewitness accounts of survivors, the Axeman was a white working-class male in his 30s when the attacks began. From the ease with which he broke into the groceries and his use of a railroad shoe pin, a common burglary tool, the police concluded that he was an experienced burglar. The Axeman vanished from New Orleans after the attack on the Cortimiglias. (The murder of Mike Pepitone in August 1919, while sometimes attributed to the Axeman, actually appears to have been part of a longstanding vendetta.) Evidence from police records and newspaper accounts, however, shows that he struck elsewhere in Louisiana, killing Joseph Spero and his daughter in Alexandria in December 1920, Giovanni Orlando in DeRidder in January 1921, and Frank Scalisi in Lake Charles in April 1921. The killer’s modus operandi was the same: breaking into an Italian grocery in the middle of the night and attacking the grocer and his family with their own axe. The Axeman then disappeared from history. The Italians of New Orleans didn’t. They continued to prosper. Although the corner groceries eventually disappeared with the growth of supermarkets, the Italians, like so many immigrants before them, joined mainstream American society while continuing to maintain their own ethnic identity. Miriam C. Davis is the author of The Axeman of New Orleans: The True Story.
206f673a7d519f663303042196d445d8
https://www.smithsonianmag.com/history/beach-lady-84237022/
Beach Lady
Beach Lady When I telephoned my friend MaVynee Betsch in American Beach, Florida, recently, I got her answering machine. It would be hard to overstate my amazement. An answering machine! In all the years I’ve known MaVynee, she’s never even had a home telephone. Actually, for many of those years, she didn’t have a home. She resided sporadically in a donated trailer or in loaned basement rooms, but primarily (and willfully) on a chaise longue on the beach. Now, at the firm insistence of family and friends, she has moved into a small apartment, gotten herself listed with directory assistance and given up her nomad ways. Or maybe not. "Hello," said the voice on the tape. "This is the Beach Lady. If you’re getting this message, it may be because I have turned into a butterfly and floated out over the sand dune." That’s MaVynee: defying gravity, determinedly whimsical in the face of adversity and diminished fortunes. She was not always a hermit at the beach. She was raised in one of the preeminent black families in the South and was educated at the Oberlin Conservatory of Music in Ohio. She studied voice in Paris and London, and sang opera throughout Germany during the mid-1950s and early ’60s in concert halls where she is still remembered, four decades after she quit her glamorous career because she felt herself called home to Florida. She jettisoned more than her diva status. She also gave away her significant inheritance, writing checks to conservation causes until the money ran out in the late 1970s, with the intangible compensation that a textbook on butterflies is dedicated to her and an Atlantic-traveling whale has been given her name (MaVynee #1151) by biologists at Boston’s New England Aquarium. If MaVynee does indeed decide to float off as a butterfly, she certainly won’t lack for credentials. In spring 2002, MaVynee was diagnosed with cancer, and surgeons removed her stomach. That triggered her family’s insistence that she finally move indoors. In the fall came worse news: her cancer had recurred and spread, and doctors said she might have only months to live. That’s why I was calling. When MaVynee heard my voice, she picked up the phone (MaVynee, already screening her calls!), but she didn’t want to linger on her health. She wanted to discuss her plans. MaVynee intends to start a museum. The institution MaVynee envisions will contain the history of American Beach, the town where she’s lived many of her 68 years. American Beach is on Amelia Island, nearly 40 miles north of downtown Jacksonville on the Atlantic Coast. It was built in the 1930s by Florida’s first insurance company, the Afro-American Life Insurance Company, at the behest of its president, Abraham Lincoln Lewis, Florida’s first black millionaire. For decades it flourished as an ocean-side paradise for blacks from around the country, who admittedly had little choice. "When we were children, could we go to the beach just anywhere we wanted?" MaVynee asks the college kids who come through town on buses for history tours. "Uh-uh. No...way...José!" Her voice is as cultured, worldly and refined as you’d expect a former opera star’s to be, and her carriage so regal that when she sits on her busted plastic beach chair on the borrowed sundeck of Abraham Lincoln Lewis’ old home (the oldest house on the beach), you would think that she owned the place. Which in a way she does: A. L. Lewis was her great-grandfather. 
Many of those visiting the Beach in its heyday were likewise illustrious—writer Zora Neale Hurston, heavyweight champion Joe Louis, entertainer Cab Calloway and civil rights leader A. Philip Randolph among them. But most were ordinary working-class African-Americans who came to enjoy (as the Beach’s advertisements phrased it) "relaxation and recreation without humiliation." The town retains even today that democratic mix. It is the home of one of the first black graduates of Mount Holyoke and the first black Florida supreme court justice since Reconstruction. And it is also the home of ordinary folks. "See that house?" MaVynee asks visitors. "A maid lives there. And a postman lives over there. Where else in America do maids own beach homes?" American Beach was born in a time when black life was dominated by the strictures of Jim Crow. Shut out from the white economy, African-Americans created their own, and in Philadelphia and Atlanta and Los Angeles and most other major American cities, they lived and shopped in a separate universe parallel to the white one nearby. Jacksonville had its own thriving black stores and restaurants, factories, newspapers, banks, insurance companies and hospitals and, as a direct consequence, its own black professional establishment. If that establishment was wealthy and educated, it was also invisible to most whites, who tended to think of black people as entertainers, criminals or "the help." The black middle class even vacationed out of white sight, in resorts like Oak Bluffs on Martha’s Vineyard and Val Verde outside Los Angeles. And American Beach. Most of those places have languished—after the demise of segregation, they weren’t needed the way they once had been, and the businesses that created and fostered them closed as well. The Afro-American Life Insurance Company shut its doors in 1991, and what’s left of American Beach, with fewer than 25 year-round families, doesn’t even make an appearance on many Florida maps. Most of its homes are aging and modest; a few of the grandest have been torn down. And its businesses—the nightclubs, hotels and restaurants that used to throb with activity all summer long—are boarded up. There are many who think American Beach will not be around much longer, considering the pressure from rich developers. Eight years ago, a large section of property that had once belonged to the Beach, including a giant sand dune that dominates the town, was sold to Amelia Island Plantation, one of the multimillion-dollar golf and vacation resorts that are American Beach’s neighbors. MaVynee vehemently opposed the sale—we are talking, after all, about the same dune over which she envisions flapping her butterfly wings. She calls it NaNa and grieved its loss as though the dune were a member of her family. The resort preserved it and built a golf course on much of the land behind it. If this all makes the idea of an American Beach museum seem quixotic, add the melancholy fact that the museum’s main advocate is herself a veritable pauper. MaVynee’s minimal rent is paid by her sister in North Carolina and her medical bills by Social Security. Friends pony up for her pharmacy and phone bills. But those who know her know never to bet against her. In whatever celestial gambling den museum futures are traded, the museum at American Beach may be listed as a long shot. But the smart money’s on the Beach Lady. After all, MaVynee has a way of beating the odds. Case in point: NaNa. 
This year, Amelia Island Plantation, MaVynee’s old antagonist, made arrangements to transfer the sand dune, in MaVynee’s honor, to the National Park Service. MaVynee’s friends wanted to present the news to her as a surprise on her birthday this past January 13, but they discovered that the transfer required, literally, an act of Congress. Now, Representative Ander Crenshaw and Senator Bill Nelson, both of Florida, have come to the rescue; they are introducing the necessary legislation. The schoolchildren of American Beach have a theory about MaVynee’s magical ability to prevail—they whisper that she’s a shaman or a witch. Their evidence is her appearance: her fingernails are very long—until they got clipped in the hospital, those on her left hand spiraled to more than a foot and a half. Her hair, coiffed into a wheel over her head, cascades in graying dreadlocks down her back and past her ankles. Her hair and clothes are festooned with political buttons, unfailingly radical and generally funny, most expressing her commitment to social and racial justice, ecological causes and vegetarianism. Her colorfulness acts as a mighty come-on, especially for children. "They come to see my hair," MaVynee says mischievously, "and I give ’em a little history." It’s a history that’s been lost to the larger world and even to the younger generation of blacks. The museum MaVynee envisions would reverse that invisibility and highlight the culture of Abraham Lincoln Lewis’ generation. "It’s awesome," MaVynee says, "how they stuck together and created a world without outside help." The message transcends the artificial boundary of "black history," she says. In this era of corporate scandal, Americans are debating the obligations of the business world and its leaders to society. No group has confronted those questions more directly than did the black businessmen of A. L. Lewis’ generation, who felt an explicit obligation to "uplift" their community. Herself a vivid relic of that great history, MaVynee has collected many other relics to start her museum: old license plate holders that advertise "Negro Ocean Playground," Afro-American Life Insurance Company ashtrays that vow "A Relief in Distress," and a wealth of papers, including 19th-century land deeds and stock certificates and such manuscripts as A. L. Lewis’ speech before Booker T. Washington’s National Negro Business League. For years MaVynee kept her stash in milk crates, stored out of the rain in her various way stations. She hopes that a formal repository for such treasures will encourage others who experienced the Beach’s history to contribute their keepsakes and records. Prospects for the museum at American Beach are looking rosy. The county is providing a room in a new community center on the outskirts of town. A committee that includes historians and museum directors hopes to expand MaVynee’s trove and to raise $500,000 in funds. Says Rowena Stewart, former executive director of the American Jazz Museum in Kansas City: "We are planning for photographs, signs, posters, clothing of the period—any artifacts we can use to re-create, in this small space, the experience of being at the Beach during the time when its role was so crucial. And we are tape-recording the recollections of the early residents for an oral history archive." "I know I’m blessed," MaVynee says, "because anytime anything bad happens to me, something good comes out of it. I swear sometimes I think my great-grandfather is looking out for me." He may be at that. 
MaVynee’s most recent checkup showed the fast-moving cancer stalled in its tracks, and a mystified physician told her that if she keeps on like this, he’ll have to revise his prognosis. She’s beating the odds once again, it seems, and her many friends hope that her floating butterfly days are far ahead of her.
f5bdbf311cecd4cbb6ff59c2c6f3e25c
https://www.smithsonianmag.com/history/becoming-anne-frank-180970542/?src=longreads
People love dead Jews. Living Jews, not so much. This disturbing idea was suggested by an incident this past spring at the Anne Frank House, the blockbuster Amsterdam museum built out of Frank’s “Secret Annex,” or in Dutch, “Het Achterhuis [The House Behind],” a series of tiny hidden rooms where the teenage Jewish diarist lived with her family and four other persecuted Jews for over two years, before being captured by Nazis and deported to Auschwitz in 1944. Here’s how much people love dead Jews: Anne Frank’s diary, first published in Dutch in 1947 via her surviving father, Otto Frank, has been translated into 70 languages and has sold over 30 million copies worldwide, and the Anne Frank House now hosts well over a million visitors each year, with reserved tickets selling out months in advance. But when a young employee at the Anne Frank House in 2017 tried to wear his yarmulke to work, his employers told him to hide it under a baseball cap. The museum’s managing director told newspapers that a live Jew in a yarmulke might “interfere” with the museum’s “independent position.” The museum finally relented after deliberating for six months, which seems like a rather long time for the Anne Frank House to ponder whether it was a good idea to force a Jew into hiding. One could call this a simple mistake, except that it echoed a similar incident the previous year, when visitors noticed a discrepancy in the museum’s audioguide displays. Each audioguide language was represented by a national flag—with the exception of Hebrew, which was represented only by the language’s name in its alphabet. The display was eventually corrected to include the Israeli flag. These public relations mishaps, clumsy though they may have been, were not really mistakes, nor even the fault of the museum alone. On the contrary, the runaway success of Anne Frank’s diary depended on playing down her Jewish identity: At least two direct references to Hanukkah were edited out of the diary when it was originally published. Concealment was central to the psychological legacy of Anne Frank’s parents and grandparents, German Jews for whom the price of admission to Western society was assimilation, hiding what made them different by accommodating and ingratiating themselves to the culture that had ultimately sought to destroy them. That price lies at the heart of Anne Frank’s endless appeal. After all, Anne Frank had to hide her identity so much that she was forced to spend two years in a closet rather than breathe in public. And that closet, hiding place for a dead Jewish girl, is what millions of visitors want to see. * * * Surely there is nothing left to say about Anne Frank, except that there is everything left to say about her: all the books she never lived to write. For she was unquestionably a talented writer, possessed of both the ability and the commitment that real literature requires. Quite the opposite of how an influential Dutch historian described her work in the article that spurred her diary’s publication—a “diary by a child, this de profundis stammered out in a child’s voice”—Frank’s diary was not the work of a naif, but rather of a writer already planning future publication. 
Frank had begun the diary casually, but later sensed its potential; upon hearing a radio broadcast in March of 1944 calling on Dutch civilians to preserve diaries and other personal wartime documents, she immediately began to revise two years of previous entries, with a title (Het Achterhuis, or The House Behind) already in mind, along with pseudonyms for the hiding place’s residents. Nor were her revisions simple corrections or substitutions. They were thoughtful edits designed to draw the reader in, intentional and sophisticated. Her first entry in the original diary, for instance, begins with a long description of her birthday gifts (the blank diary being one of them), an entirely unself-conscious record by a 13-year-old girl. The first entry in her revised version, on the other hand, begins with a deeply self-aware and ironic pose: “It’s an odd idea for someone like me to keep a diary; not only because I have never done so before, but because it seems to me that neither I—nor for that matter anyone else—will be interested in the unbosomings of a 13-year-old schoolgirl.” The innocence here is all affect, carefully achieved. Imagine writing this as your second draft, with a clear vision of a published manuscript, and you have placed yourself not in the mind of a “stammering” child, but in the mind of someone already thinking like a writer. In addition to the diary, Frank also worked hard on her stories, or as she proudly put it, “my pen-children are piling up.” Some of these were scenes from her life in hiding, but others were entirely invented: stories of a poor girl with six siblings, or a dead grandmother protecting her orphaned grandchild, or a novel-in-progress about star-crossed lovers featuring multiple marriages, depression, a suicide and prophetic dreams. (Already wary of a writer’s pitfalls, she insisted the story “isn’t sentimental nonsense for it’s modeled on the story of Daddy’s life.”) “I am the best and sharpest critic of my own work,” she wrote a few months before her arrest. “I know myself what is and what is not well written.” What is and what is not well written: It is likely that Frank’s opinions on the subject would have evolved if she had had the opportunity to age. Reading the diary as an adult, one sees the limitations of a teenager’s perspective, and longs for more. In one entry, Frank describes how her father’s business partners—now her family’s protectors—hold a critical corporate meeting in the office below the family’s hiding place. Her father, she and her sister discover that they can hear what is said by lying down with their ears pressed to the floor. In Frank’s telling, the episode is a comic one; she gets so bored that she falls asleep. But adult readers cannot help but ache for her father, a man who clawed his way out of bankruptcy to build a business now stolen from him, reduced to lying face-down on the floor just to overhear what his subordinates might do with his life’s work. When Anne Frank complains about her insufferable middle-aged roommate Fritz Pfeffer (Albert Dussel, per Frank’s pseudonym) taking his time on the toilet, adult readers might empathize with him as the only single adult in the group, permanently separated from his non-Jewish life partner whom he could not marry due to anti-Semitic laws. 
Readers Frank’s age connect with her budding romance with fellow hidden resident Peter van Pels (renamed Peter van Daan), but adults might wonder how either of the married couples in the hiding place managed their own relationships in confinement with their children. Readers Frank’s age relate to her constant complaints about grown-ups and their pettiness, but adult readers are equipped to appreciate the psychological devastation of Frank’s older subjects, how they endured not only their physical deprivation, but the greater blow of being reduced to a childlike dependence on the whims of others. Frank herself sensed the limits of the adults around her, writing critically of her own mother’s and Peter’s mother’s apparently trivial preoccupations—and in fact these women’s prewar lives as housewives were a chief driver for Frank’s ambitions. “I can’t imagine that I would have to lead the same sort of life as Mummy and Mrs. v.P. [van Pels] and all the women who do their work and are then forgotten,” she wrote as she planned her future career. “I must have something besides a husband and children, something that I can devote myself to!” In the published diary, this passage is immediately followed by the famous words, “I want to go on living even after my death!” By plastering this sentence on Frank’s book jackets, publishers have implied that her posthumous fame represented the fulfillment of the writer’s dream. But when we consider the writer’s actual ambitions, it is obvious that her dreams were in fact destroyed—and it is equally obvious that the writer who would have emerged from Frank’s experience would not be anything like the writer Frank herself originally planned to become. Consider, if you will, the following imaginary obituary of a life unlived: Anne Frank, noted Dutch novelist and essayist, died Wednesday at her home in Amsterdam. She was 89. A survivor of Auschwitz and Bergen-Belsen, Frank achieved a measure of fame that was hard won. In her 20s she struggled to find a publisher for her first book, "The House Behind." The two-part memoir consisted of a short first section detailing her family’s life in hiding in Amsterdam, followed by a much longer and more gripping account of her experiences at Auschwitz, where her mother and others who had hidden with her family were murdered, and later at Bergen-Belsen, where she witnessed her sister Margot’s horrific death. Disfigured by a brutal beating, Frank rarely granted interviews; her later work, "The Return," describes how her father did not recognize her upon their reunion in 1945. "The House Behind" was searing and accusatory: The family’s initial hiding place, mundane and literal in the first section, is revealed in the second part to be a metaphor for European civilization, whose facade of high culture concealed a demonic evil. “Every flat, every house, every office building in every city,” she wrote, “they all have a House Behind.” The book drew respectful reviews, but sold few copies. She supported herself as a journalist, and in 1961 traveled to Israel to cover the trial of Adolf Eichmann for the Dutch press. She earned special notoriety for her fierce reporting on the Nazi henchman’s capture, an extradition via kidnapping that the Argentine elite condemned. Frank soon found the traction to publish Margot, a novel that imagined her sister living the life she once dreamed of, as a midwife in the Galilee. 
A surreal work that breaks the boundaries between novel and memoir, and leaves ambiguous which of its characters are dead or alive, Margot became wildly popular in Israel. Its English translation allowed Frank to find a small but appreciative audience in the United States. Frank’s subsequent books and essays continued to win praise, if not popularity, earning her a reputation as a clear-eyed prophet carefully attuned to hypocrisy. Her readers will long remember the words she wrote in her diary at 15, included in the otherwise naive first section of "The House Behind": “I don’t believe that the big men are guilty of the war, oh no, the little man is just as guilty, otherwise the peoples of the world would have risen in revolt long ago! There’s in people simply an urge to destroy, an urge to kill, to murder and rage, and until all mankind without exception undergoes a great change, wars will be waged, everything that has been built up, cultivated and grown will be cut down and disfigured, and mankind will have to begin all over again.” Her last book, a memoir, was titled "To Begin Again." * * * The problem with this hypothetical, or any other hypothetical about Frank’s nonexistent adulthood, isn’t just the impossibility of knowing how her life and career might have developed. The problem is that the entire appeal of Anne Frank to the wider world—as opposed to those who knew and loved her—lies in her lack of a future. There is an exculpatory ease to embracing this “young girl,” whose murder is almost as convenient for her many enthusiastic readers as it was for her persecutors, who found unarmed Jewish children easier to kill off than the Allied infantry. After all, an Anne Frank who lived might have been a bit upset at the Dutch people who, according to the leading theory, turned in her household and received a reward of approximately $1.40 per Jew. An Anne Frank who lived might not have wanted to represent “the children of the world,” particularly since so much of her diary is preoccupied with a desperate plea to be taken seriously—to not be perceived as a child. Most of all, an Anne Frank who lived might have told people about what she saw at Westerbork, Auschwitz and Bergen-Belsen, and people might not have liked what she had to say. And here is the most devastating fact of Frank’s posthumous success, which leaves her real experience forever hidden: We know what she would have said, because other people have said it, and we don’t want to hear it. The line most often quoted from Frank’s diary—“In spite of everything, I still believe that people are really good at heart”—is often called “inspiring,” by which we mean that it flatters us. It makes us feel forgiven for those lapses of our civilization that allow for piles of murdered girls—and if those words came from a murdered girl, well, then, we must be absolved, because they must be true. That gift of grace and absolution from a murdered Jew (exactly the gift, it is worth noting, at the heart of Christianity) is what millions of people are so eager to find in Frank’s hiding place, in her writings, in her “legacy.” It is far more gratifying to believe that an innocent dead girl has offered us grace than to recognize the obvious: Frank wrote about people being “truly good at heart” three weeks before she met people who weren’t. Here’s how much some people dislike living Jews: They murdered six million of them. Anne Frank’s writings do not describe this process. 
Readers know that the author was a victim of genocide, but that does not mean they are reading a work about genocide. If that were her subject, it is unlikely that those writings would have been universally embraced. We know this because there is no shortage of texts from victims and survivors who chronicled the fact in vivid detail, and none of those documents has achieved anything like the fame of Frank’s diary. Those that have come close have only done so by observing the same rules of hiding, the ones that insist on polite victims who don’t insult their persecutors. The work that came closest to achieving Frank’s international fame might be Elie Wiesel’s Night, a memoir that could be thought of as a continuation of Frank’s experience, recounting the tortures of a 15-year-old imprisoned in Auschwitz. As the scholar Naomi Seidman has discussed, Wiesel first published his memoir in Yiddish, under the title And the World Kept Silent. The Yiddish book told the same story, but it exploded with rage against his family’s murderers and, as the title implies, the entire world whose indifference (or active hatred) made those murders possible. With the help of the French Catholic Nobel laureate François Mauriac, Wiesel later published a French version of the book under the title Night—a work that repositioned the young survivor’s rage into theological angst. After all, what reader would want to hear about how his society had failed, how he was guilty? Better to blame God. This approach did earn Wiesel a Nobel Peace Prize, as well as a spot in Oprah’s Book Club, the American epitome of grace. It did not, however, make teenage girls read his book in Japan, the way they read Frank’s. For that he would have had to hide much, much more. What would it mean for a writer not to hide the horror? There is no mystery here, only a lack of interest. To understand what we are missing, consider the work of another young murdered Jewish chronicler of the same moment, Zalmen Gradowski. Like Frank’s, Gradowski’s work was written under duress and discovered only after his death—except that Gradowski’s work was written in Auschwitz, and you have probably never heard of it. Gradowski was one of the Jewish prisoners in Auschwitz’s Sonderkommando: those forced to escort new arrivals into the gas chambers, haul the newly dead bodies to the crematoriums, extract any gold teeth and then burn the corpses. Gradowski, a young married man whose entire family was murdered, reportedly maintained his religious faith, reciting the kaddish (mourner’s prayer) each evening for the victims of each transport—including Peter van Pels’ father, who was gassed a few weeks after his arrival in Auschwitz on September 6, 1944. Gradowski recorded his experiences in Yiddish in documents he buried, which were discovered after the war; he himself was killed on October 7, 1944, in a Sonderkommando revolt that lasted only one day. (The documents written by Gradowski and several other prisoners inspired the 2015 Hungarian film Son of Saul, which, unsurprisingly, was no blockbuster, despite an Academy Award and critical acclaim.) “I don’t want to have lived for nothing like most people,” Frank wrote in her diary. “I want to be useful or give pleasure to the people around me who don’t yet know me, I want to go on living even after my death!” Gradowski, too, wrote with a purpose. But Gradowski’s goal wasn’t personal or public fulfillment. His was truth: searing, blinding prophecy, Jeremiah lamenting a world aflame. 
“It may be that these, the lines that I am now writing, will be the sole witness to what was my life,” Gradowski writes. “But I shall be happy if only my writings should reach you, citizen of the free world. Perhaps a spark of my inner fire will ignite in you, and even should you sense only part of what we lived for, you will be compelled to avenge us—avenge our deaths! Dear discoverer of these writings! I have a request of you: This is the real reason why I write, that my doomed life may attain some meaning, that my hellish days and hopeless tomorrows may find a purpose in the future.” And then Gradowski tells us what he has seen. Gradowski’s chronicle walks us, step by devastating step, through the murders of 5,000 people, a single large “transport” of Czech Jews who were slaughtered on the night of March 8, 1944—a group that was unusual only because they had already been detained in Birkenau for months, and therefore knew what was coming. Gradowski tells us how he escorted the thousands of women and young children into the disrobing room, marveling at how “these same women who now pulsed with life would lie in dirt and filth, their pure bodies smeared with human excrement.” He describes how the mothers kiss their children’s limbs, how sisters clutch each other, how one woman asks him, “Say, brother, how long does it take to die? Is it easy or hard?” Once the women are naked, Gradowski and his fellow prisoners escort them through a gantlet of SS officers who had gathered for this special occasion—a night gassing arranged intentionally on the eve of Purim, the biblical festival celebrating the Jews’ narrow escape from a planned genocide. He recalls how one woman, “a lovely blond girl,” stopped in her death march to address the officers: “‘Wretched murderers! You look at me with your thirsty, bestial eyes. You glut yourselves on my nakedness. Yes, this is what you’ve been waiting for. In your civilian lives you could never even have dreamed about it. [...] But you won’t enjoy this for long. Your game’s almost over, you can’t kill all the Jews. And you will pay for it all.’ And suddenly she leaped at them and struck Oberscharführer Voss, the director of the crematoriums, three times. Clubs came down on her head and shoulders. She entered the bunker with her head covered with wounds [...] she laughed for joy and proceeded calmly to her death.” Gradowski describes how people sang in the gas chambers, songs that included Hatikvah, “The Hope,” now the national anthem of Israel. And then he describes the mountain of open-eyed naked bodies that he and his fellow prisoners must pull apart and burn: “Their gazes were fixed, their bodies motionless. In the deadened, stagnant stillness there was only a hushed, barely audible noise—a sound of fluid seeping from the different orifices of the dead. [...] Frequently one recognizes an acquaintance.” In the specially constructed ovens, he tells us, the hair is first to catch fire, but “the head takes the longest to burn; two little blue flames flicker from the eyeholes—these are the eyes burning with the brain. [...] The entire process lasts 20 minutes—and a human being, a world, has been turned to ashes. [...] It won’t be long before the five thousand people, the five thousand worlds, will have been devoured by the flames.” Gradowski was not poetic; he was prophetic. He did not gaze into this inferno and ask why. He knew. 
Aware of both the long recurring arc of destruction in Jewish history, and of the universal fact of cruelty’s origins in feelings of worthlessness, he writes: “This fire was ignited long ago by the barbarians and murderers of the world, who had hoped to drive darkness from their brutal lives with its light.” One can only hope that we have the courage to hear this truth without hiding it, to face the fire and to begin again. This article is a selection from the November issue of Smithsonian magazine
d3033b8c7c9c47fa97058bc3fddb697c
https://www.smithsonianmag.com/history/before-rosie-the-riveter-farmerettes-went-to-work-141638628/
World War I: 100 Years Later
World War I: 100 Years Later From 1917 to 1919, the Woman's Land Army of America brought more than 20,000 city and town women to rural America to take over farm work after men were called to war. Most of these women had never before worked on a farm, but they were soon plowing fields, driving tractors, planting and harvesting. The Land Army's "farmerettes" were paid wages equal to those of male farm laborers and were protected by an eight-hour workday. For many, the farmerettes were shocking at first—wearing pants!—but farmers began to rely upon the women workers. Inspired by the women of Great Britain, who had organized as the Land Lassies, the Woman’s Land Army of America was established by a consortium of women’s organizations—including gardening clubs, suffrage societies, women’s colleges, civic groups, and the YWCA. The WLA provided a fascinating example of women mobilizing themselves and challenged conventional thinking about gender roles. Like Rosie the Riveter a generation later, the Land Army farmerette became a wartime icon. The following excerpt from Fruits of Victory: The Woman’s Land Army in the Great War chronicles the farmerettes of the California division of the Woman’s Land Army. A brass band welcomed the first unit of the California Woman’s Land Army when it arrived in the town of Elsinore on the first of May, 1918. The whole community turned out to greet the fifteen women dressed in their stiff new uniforms. The Chamber of Commerce officials gave speeches of welcome, the Farm Bureau president thanked the “farmerettes” for coming, and the mayor gave them the keys to the city. The Land Army recruits drove the fifty miles from the WLA headquarters offices in downtown Los Angeles to Elsinore in style: the mayor had dispatched a truck to chauffeur them. At the welcoming ceremonies, Mayor Burnham apologized for the lack of an official municipal key ring, and offered instead a rake, hoe, and shovel to the farmerettes, “emblematic of their toil for patriotic defense.” The grateful citizens of Elsinore gave the farmerettes three loud cheers. While California fruit growers held lucrative contracts with the U.S. military to supply troops with dried and canned fruit, the extreme wartime farm labor shortage enabled the California Woman’s Land Army to demand extraordinary employment terms: a guaranteed contract, pay equal to what local male farm laborers could command, an eight-hour day, and overtime pay. The employers also agreed to worker protections—comfortable living quarters, designated rest periods, lifting limits, and workers’ compensation insurance—considered radical for the time. The Los Angeles Times trumpeted the arrival of the “Great Land Army” in Elsinore as an “Epochal Experiment” and proclaimed the farmerettes were “To Turn New Earth in History of the American Woman.” Photographs of the farmerettes’ first day at work, handling horse-drawn cultivators and gangplows, or at the wheel of giant tractors, were spread across the pages of the state’s newspapers. Asked whether the strenuous labor might prove too hard and some of the farmerettes might give up after a short stint, the recruits denied that was even possible. “Would we quit?” one farmerette told a reporter. “No, soldiers don’t.” Idella Purnell didn’t lie about her age in order to join the Northern California division of the WLA, which opened its San Francisco headquarters just a week later. She didn’t need to. 
The daughter of American parents, Idella was raised in Mexico but came north in preparation for entering university at Berkeley that fall. As a patriotic gesture, she wanted to serve in the Land Army in the summer months, but she was only seventeen years old, a year shy of the official entrance age. She passed her physical at headquarters, “and as I am ‘husky’ they decided to let my youth go unnoticed and simply make me 18!” Purnell confided after the fact. The San Francisco recruiting officers were willing to bend the rules as they faced the prospect of trying to fill their large quotas; requests for more farmerettes were pouring in daily. “This is the recruiting slogan of the Women’s Land Army of America,” reported one San Francisco area newspaper: “Joan of Arc Left the Soil to Save France. We’re Going Back to the Soil to Save America.” An “advanced guard” of women, mostly Berkeley students, was sent to the University of California’s agricultural farm at Davis for training and soon proved themselves “extremely efficient and as capable as men workers.” Another unit was based in the dormitories of Stanford and worked the crops of the Santa Clara Valley in WLA uniform. Sacramento set up a district WLA office, and more than 175 women enlisted for service in the first month. “Up in Sacramento they are nearly as proud of the WLA as of the new aviation field,” reported the San Francisco Examiner. “In both cases justification lies in actual achievement…the WLA shows that the women and girls are serious…and mean to do their bits.” In mid-June, on the eve of their deployment, twenty-four fresh recruits gathered in the San Francisco WLA headquarters, located in the Underwood Building on Market Street. They were the first group assigned to the brand-new farmerette camp at Vacaville, and they were summoned together for a pre-departure pep talk. The Vacaville Camp was constructed and furnished by a consortium of local fruit growers, who paid for it out of their own pockets. They built the camp on high ground near the Vacaville train station, with a six-foot-high pine stockade surrounding it for privacy. Inside the stockade were canvas sleeping tents with wood floors, a screened kitchen and dining room, showers, and a dressing room, as well as a hospital tent. The camp cost about $4,500 to build, and the growers agreed to share the investment: only those who contributed towards the camp were to enjoy the assistance of the farmerettes. These farmerettes now assembled in the San Francisco WLA office, listening as their supervisor, Alice Graydon Phillips, explained what their life and work would be like in the Vacaville Camp. She warned them that the summer heat would be brutal, and that picking fruit atop ladders would make their backs, arms, and fingers sore. She read them the Woman’s Land Army pledge and then asked aloud: would they willingly arise to the sound of a bugle at 5:30 in the morning? “Yes!” they shouted. Would they consent to the WLA military-style structure? “Yes,” they agreed in unison. Would they agree to muster for inspection, line up for exercise drills, take kitchen police duty, and eat the rations they were served without complaint? “Yes!” Would they submit to strict rules of discipline—including the provision that five offenses for lateness constituted one breach of discipline and an honorable discharge? Here the “Yes” chorus was punctuated by some sighs, but they assented. They signed the pledge forms. 
They elected two “majors” from their ranks to lead them—one, a girl who had four brothers fighting at the front; the other, an older woman from Santa Barbara with girl-club experience. Led by a college girl from Berkeley, they all joined in a rousing cheer: Don’t be a slacker / Be a picker or a packer / WLA, Rah, rah, rah! They took the early train to Vacaville, just beyond Napa, a journey of about sixty miles. “It was hot in the orchard at Napa,” Idella Purnell recalled. “The sun rose higher and higher, and the long ladders grew heavier and heavier. Perspiration started on our foreheads and beaded our lips. The golden peaches were so high—so hard to reach! The peach fuzz and dust on our throats and arms began to irritate the skin, but we did not dare scratch—we knew that would only aggravate the trouble. One who has never had ‘peach fuzz rash’ cannot appreciate the misery of those toiling, dusty, hot-faced girls.” Purnell, who would make her career as a writer and editor of an influential poetry journal, was getting a crash course in the less romantic aspects of farmerette life. As word of their good work spread, more northern and southern California farmers asked for WLA units to be based near their orchards and ranches. The newspapers charted the farmerettes’ summons into the golden groves with headlines like “Hundreds Go Into Fields at Once” and “Women to Till Thousands of Southern California’s Acres.” Sunset magazine carried an editorial in its July issue titled “The Woman’s Land Army is Winning,” illustrated by a photo of farmerettes in uniform posing with hoes slung over their shoulders like guns. The Los Angeles Times sent one of its star reporters, Alma Whitaker, to spend a day working with a Land Army unit, and she came away rather dazzled. Describing one farmerette as “tall and husky and wields a spade like a young Amazon her sword” and another as possessing “a pair of shoulders and muscular arms like a bantam lightweight,” Whitaker was taken with the farmerettes’ serious attitude: “This woman’s land army, composed of able-bodied young women, selected just as the men are selected by the army, for their physical capacity, their good characters, their general deportment, and trained and disciplined even rather more strictly than the men... are acquitting themselves with amazing efficiency.” Whitaker took note of the Land Army uniform, which became a hot topic of conversation that summer: “The official uniform has called forth criticism,” she reported. “Farm laborers don’t wear uniforms. But those uniforms are proven to be an essential and desirable asset, for not only are they intensely practical, but they have exactly the same effect on girls as they do on the men—one lives up to a uniform.” As in the military, the Land Army uniform also served as a great social equalizer and provided a powerful sense of social cohesion. “The cotton uniform,” wrote one California farmerette, “soon muddy and fruit stained, in which some girls looked picturesque, but no one overwhelmingly beautiful, leveled all distinction except those of personality, manners and speech.” As the season progressed, Idella Purnell was promoted to the captaincy of her own squad of Land Army workers. But amid the grape vines of Lodi, Captain Purnell encountered what every American feared in this time of war: the snake in the garden, the saboteur. 
At first Purnell assumed the woman was simply that lesser form of wartime menace, the slacker, not willing to do her share, but Purnell’s suspicions hardened when her lazy farmerette resorted to shoddy picking: “She took to sabotage,” Purnell explained. “Green grapes, rotten grapes—anything and everything went into her boxes, tossed there by a hand careless of the precious bloom—and they all were only half full.” Purnell tried to handle the situation herself: “I remonstrated—mildly at first. I showed her again…At noon I made a special talk to the girls for her benefit, in which I pointed out that we were soldiers just as much as the ones ‘over there,’ that we too had a chance to make good—or to be classified as slackers and cowards. I made it clear that a slacker was a person who tried to palm off poor boxes of grapes for good ones. One bad bunch ruins a whole box, and that is the same as helping shoot cannonballs at our boys.” But the slacker farmerette did not improve: “In fact, she seemed to take a malicious delight in doing her worst, and trying to get away with it,” said Purnell. “I argued, pleaded, threatened and scolded by turns. Commanding did no good. That night I made a report to the camp supervisor, and learned that mine was not the first complaint against her. Mine was the last straw, and she was dishonorably discharged.” A saboteur farmerette in the ranks was exceedingly rare; more often the Land Army worker was hailed as the “Patriot Farmerette.” And in that role, she deserved a “pin-up” above her cot, a photo of a handsome movie star to inspire her, just like her brother in the army or navy had his starlets, teased Los Angeles Times reporter Alma Whitaker, who archly exhorted the local movie industry’s matinee idols to do their bit by becoming “godfathers” to farmerettes and other women war workers: “Now, while our masculine regiments are well supplied with fair godmothers, not a single godfather has arisen for the benefit of the land army girls or the war efficiency motor maids or the Red Cross chapter girls… It isn’t fair. What are the stylish picture heroes thinking about? Why isn’t Charlie Chaplin or Douglas Fairbanks offering themselves in this guise? Is masculinity trying to assert, in this day and age, that women’s patriotism is not as important and self-sacrificing as men’s patriotism? Pshaw! Think of the land army girls, exuding honest sweat on California farms, day in and day out, in uniforms quite as becoming as any at Camp Kearny…all without a godfather. It would be such a nice compliment if, say, Charlie Chaplin should adopt the first unit of the woman’s land army and go down to see them decked in a land army uniform, just as Mary Pickford wore khaki when she went to San Diego.” There are no known photos of Charlie Chaplin donning a Land Army uniform, but the farmerette was truly a star in California in the summer of 1918.
940e1b60bdaec815d5a936686a08bcbe
https://www.smithsonianmag.com/history/below-the-rim-119011046/
Below the Rim
Below the Rim It was early May, but a raw breeze was blowing as we tracked bootprints through an inch of new-fallen snow. Shortly after dawn, we had parked on the Desert View Drive and set off through the ponderosa forest toward the Grand Canyon, leaving behind the tourist traffic hurtling along the canyon’s South Rim. After hiking a mile, the three of us—mountaineer Greg Child, photographer Bill Hatcher and I—emerged abruptly from the trees to stand on a limestone promontory overlooking the colossal chasm. The view was predictably sublime—distant ridges and towers blurred to pastel silhouettes by the morning haze; the North Rim, 20 miles distant, smothered in storm; the turgid flood of the Colorado River silenced by the 4,800-foot void beneath our feet. But we hadn’t come for the scenery. We scrambled off the point, slithering among boulders as we lost altitude. A few hundred feet below the rim we were stopped by a band of rock that dropped nearly ten feet. We tied a rope to a clump of serviceberry bushes and slid down it, leaving the rope in place for our return. We had found our way through the canyon’s Kaibab Limestone cap rock and alighted atop a 400-foot precipice of Coconino Sandstone. For miles on either side, this band of grayish orange rock was too sheer to descend, but the prow itself was broken into sharp-angled steps. We took the line of least resistance, sidling around towers and straddling grooves, with the emptiness below our soles reminding us of the consequences of a misstep. Then the going got really tricky. We faced inward, moving slowly from one handhold and foothold to the next. All three of us are experienced climbers, but the terrain was as difficult as any of us dared tackle without ropes and hardware. Just as the “route” threatened to blank out, Greg, in the lead, placed his foot in a rounded hollow that gave him just enough purchase to keep his balance. Another hollow for his other foot—six in a row, all told. From years of prowling through the Southwest, we knew that these subtle depressions were man-made. More than seven centuries ago, some daring acrobat had pounded them with a rock harder than sandstone. So it went for the next 90 minutes: wherever the path seemed to vanish, early pioneers had stacked a platform of flat rocks here or carved a few footholds there. At last we came out onto a broad saddle between the plunging prow and an isolated butte to the north. As we sat eating lunch, we found red and gray and white flakes of chert scattered in the dirt—the debris of an arrowhead-making workshop. Bill looked up at the route we had just descended. Had we stumbled upon it from below, we might well have judged it unclimbable. “Pretty amazing, huh?” was all he could say. But what was the trail for, and what long-vanished culture had created it? The Grand Canyon occupies such an outsize place in the public imagination that we can be forgiven for thinking we “know” it. More than four million tourists visit the canyon each year, and the National Park Service funnels the vast majority of them through a tidy gantlet of attractions confined to a relatively short stretch of the South Rim. Even people who have never visited America’s greatest natural wonder have seen so many photographs of the panorama from Grandview Point or Mather Point that the place seems familiar to them.
But the canyon is a wild and unknowable place—both vast (the national park alone covers about 1,902 square miles, about the size of Delaware) and inaccessible (the vertical drops vary from 3,000 feet to more than 6,000). The chasm lays bare no fewer than 15 geological layers, ranging from the rim-top Kaibab Limestone (250 million years old) to the river-bottom Vishnu Schist (as old as two billion years). The most ecologically diverse national park in the United States, the Grand Canyon embraces so many microclimates that hikers can posthole through snowdrifts on the North Rim while river runners on the Colorado below are sunbathing in their shorts. Among the canyon’s many enigmas, one of the most profound is its prehistory—who lived here, and when, and how, and why. At first blush, the Grand Canyon looks like a perfect place for ancient peoples to have occupied, for the Colorado River is the most abundant and reliable source of water in the Southwest. Yet before the river was dammed, it unleashed recurring catastrophes as it flooded its banks and scoured out the alluvial benches where ancients might have been tempted to dwell and farm. For all its size and geological variety, the canyon is deficient in the kinds of natural alcoves in which prehistoric settlers were inclined to build their villages. And—as Bill, Greg and I discovered that May morning—it can be fiendishly difficult to navigate. “The canyon’s got a lot to offer, but you have to work hard for it,” says National Park Service archaeologist Janet Balsom. “It’s really a marginal environment.” And yet the Grand Canyon is riddled with prehistoric trails, most of which lead from the rim down to the riverbed. Some of them are obvious, such as the routes improved by the park service into such hikers’ boulevards as the Bright Angel and South Kaibab trails. Most of the others are obscure. Archaeologists have largely left them to be explored by a few fanatically devoted climbers. The archaeology of other Southwestern regions—New Mexico’s Chaco Canyon, for instance, or Colorado’s Mesa Verde—has yielded a far more comprehensive picture of what it was like a millennium or so ago. Says Balsom: “You have to remember, only 3.3 percent of the Grand Canyon has been surveyed, let alone excavated.” Only in the past 50 years have archaeologists focused significant attention on the Grand Canyon—sometimes digging in places so remote they had to have helicopter support—and only recently have their efforts borne much fruit. Broadly speaking, archaeological evidence shows that humans have roamed the canyon for more than 8,000 years. The dimmest hint of a Paleo-Indian presence, before 6500 b.c., is succeeded by rock art and artifacts from a vivid but mysterious florescence of Archaic hunter-gatherers (6500 to 1250 b.c.). With the discovery of how to cultivate corn, bands of former nomads started building semipermanent villages on canyon terraces sometime before 1000 b.c. Two millennia later, by a.d. 1000, at least three distinct peoples flourished within the canyon, but their identities and ways of living remain poorly understood. From a.d. 1150 to 1400, there may have been a hiatus during which the entire canyon was abandoned—why, we can only guess. Today, just one group of Native Americans—the Havasupai—lives within the canyon. And even though their elders can recite origin stories with unblinking self-assurance, the tribe presents anthropologists with puzzles every bit as vexing as the ones that cling to the vanished ancients. 
The blank spaces in the timeline, the lost connections between one people and another, confound experts who only slowly are illuminating the lives that were lived so long ago below the rim. The Grand Canyon has frustrated Western explorers from the beginning. The first Europeans to behold it were a splinter party from Francisco Vásquez de Coronado’s monumental Southwest entrada of 1540-42. Their commander dispatched them to chase down a rumor about “a large river” to the west. “Several days down the river,” some Hopi informants had told them, “there were people with very large bodies.” Guided by four Hopi men, this party, headed by one García López de Cárdenas, took 20 days to reach the Grand Canyon—at least twice as long as it should have. Apparently, the Hopi were leading Cárdenas’ men the long way around to divert them from their own vulnerable villages. Cárdenas’ guides took the soldiers to a point on the South Rim not far from where the three of us slid off the precipice that morning in May 2005, choosing one of the few stretches where no trail led into the canyon. Misjudging the scale of the gorge, the Spaniards thought the river below a mere six feet wide, instead of more than a hundred yards. Cárdenas sent his three nimblest scramblers over the edge to find a way down, but after three days—during which they got only a third of the way—they returned to report that the descent was impossible. Cárdenas, who was hoping to find an easy route to the Pacific, turned back in exasperation. The first U.S. explorer to reach the Colorado River within the Grand Canyon was a government surveyor, Lt. Joseph C. Ives, who did it with guidance from Hualapai Indians in 1858. He was no more pleased than Cárdenas. The entire region, he swore in his official report, was “altogether valueless.” That judgment did not prevent John Wesley Powell from boating down the Colorado River in 1869, nor a wave of miners from invading the canyon in the 1880s, nor the establishment of the Grand Canyon National Monument in 1908 and National Park in 1919. In 1933, three Civilian Conservation Corps workers building a trail in the canyon took an off day to explore a remote cave. As they were hunting for Indian objects inside it, they later told their boss, they discovered three figurines, each made from a single willow twig. It seemed that the objects, each less than a foot in height, had been secreted away in one of the most inaccessible niches. Since then, more than 500 such figurines have been discovered. On a windy, rainy day, Bill, Greg and I stopped by the Grand Canyon National Park Museum Collection, where curator Colleen Hyde pulled about a dozen of these split-twig figurines out of their storage drawers. They ranged in length from an inch to 11 inches, but all had been made by the same method. Each artist had taken a stick of willow or skunkbush and split it lengthwise until it was held together only at one end, then folded the two ends around each other until the second could be tucked inside a wrapping formed by the first. The result appears to be an effigy of either a deer or a bighorn sheep, both of which would have been an important source of food. In recent years, many of the figurines have been carbon-dated, yielding dates ranging from 2900 to 1250 b.c.—squarely in the late Archaic period of this region. Except for a pair of broken projectile points, they are the oldest artifacts ever found in the Grand Canyon. 
The Archaic hunter-gatherers—people who had yet to discover corn or pottery or the bow and arrow—held to this rigorous artistic tradition for nearly 17 centuries, or about as long as the span from late Roman statuary to Jackson Pollock. Across the Southwest, only two areas are known to have produced split-twig figurines. A cluster centered in canyons in southeastern Utah consists of effigies wrapped according to a different method, producing a different-looking animal, and they are found only in domestic contexts, including trash dumps. But all of the Grand Canyon figurines have been discovered in deep caves in the Redwall Limestone stratum—by far the most difficult geologic layer in the canyon to climb through, because its sheer precipices lack handholds and footholds. In these caves, the objects were placed under flat rocks or small cairns, and no accompanying relics have ever been found. There is no evidence that Archaic people ever lived in these caves, and some of the caves are so difficult to get into that modern climbers would have to use ropes and hardware to do it. (Because there must be dozens, or even hundreds, of figurines yet to be discovered, the park service forbids exploration of the caves in the Redwall band, should anyone be bold enough to try.) And yet no one knows why the figurines were made, although some kind of hunting magic has long been the leading hypothesis. Among those we saw in the museum collection were several that had separate twigs stuck into the bodies of the sheep or deer, like a spear or dart. In a 2004 paper, Utah archaeologists Nancy J. Coulam and Alan R. Schroedl cite ethnographic parallels among such living hunter-gatherers as Australian Aborigines to argue that the figurines were fetishes used in a ritual of “increase magic,” and that they were the work not of individualistic shamans, but of a single clan, lasting 60 generations, that adopted the bighorn sheep as its totem. These hunters may have believed that the Grand Canyon was the place of origin of all bighorn sheep; by placing the figurines deep inside caves, under piles of rocks, they might have sought to guarantee the continued abundance of their prey. That the caves sometimes required very dangerous climbing to enter only magnified the magic. Coulam and Schroedl’s theory is both bold and plausible, yet so little is known about the daily lives of the Archaic people in the Grand Canyon that we cannot imagine a way to test it. The figurines speak to us from a time before history, but only to pose a riddle. The riddles of the Grand Canyon are not confined to prehistoric times, either, as a trip among the present-day Havasupai makes clear. They live 2,000 vertical feet below the rim, on Havasu Creek. As an old trail plunges through four geologic layers, the reddish sandstone walls broaden to accommodate the ancient village of Supai in one of the most idyllic natural oases in the American West. A few miles upstream, one of the Grand Canyon’s most powerful springs sends a torrent of crystalline blue-green water down the ravine. (The people here call themselves Ha vasúa baaja, or “people of the blue-green water.”) The calcium carbonate that gives the creek its color renders it undrinkable, but the Havasupai draw their water from an abundance of other springs and seeps on the edges of their village. By the time of their first contact with Europeans, which came in 1776, the Havasupai had long since adjusted to a seasonal round that defies logic but seems to have worked superbly for them.
In spring, summer and early autumn they lived in the canyon, planting and harvesting. Then they moved back to the rim, where, at an altitude of more than 6,000 feet, they camped in the snow and spent the winter hunting and gathering. With the coming of Anglo-Americans, that cycle of life changed. In 1882, after miners began gouging holes in the cliff walls in their quest for silver, lead and gold, the U.S. government restricted the Havasupai to the 518 acres of their village. From then on, they could no longer hunt or gather on the South Rim. Other Havasupai families lived in mid-canyon glades, such as Indian Gardens, the halfway point on today’s Bright Angel Trail. Gradually, however, they were nudged out by encroaching tourism. As late as the 1920s, a park service employee called the Havasupai a “doomed tribe” amounting to “less than two hundred wretched weaklings.” But today, the Havasupai number some 650 men, women and children. And in 1974, Congress returned much of the people’s traditional land to them, in the largest restoration ever bestowed on a Native American tribe. The Havasupai Reservation today covers more than 185,000 acres, where, ironically, the tourists have become guests of the people of the blue-green water. A number of those tourists come by helicopter; most hike in to Supai with light daypacks while Native wranglers bring their duffels on horseback or muleback. The chief draw for most visitors, however, is not the village, with its cornfields and pastures full of sleek horses, but three spectacular waterfalls downstream. Bill, Greg and I backpacked the eight miles and 2,000 feet down into Supai, looking less for the Spring Break atmosphere of high tourist season than for a chance to plumb the past. On our second day, Rex Tilousi, who was then the tribal chairman, held our nosy questions at arm’s length for an hour or so, but then relented and took us on an amble through his boyhood neighborhood. With his flowing silver hair, Colonel Sanders goatee and weather-beaten visage, Tilousi cut a striking figure. And his monologue blended sly satire with ancestral grievances. Referring to the miners, Tilousi recalled, “Here came the hairy man from the East, looking for the shining rock, wanting to get rich.” And then, more solemnly, “If it had been up to us, we never would have let the miners come down here.” The tourist campground, built by the park service before 1974, lies “right on top of where we used to cremate our people,” Tilousi told us. “It disturbs me sometimes to see that campground, but we need income from the tourists.” He stroked his goatee and said, “Our ancestors lie there. Then the government said, ‘You can’t do that anymore.’ So now we have to bury our dead, just like everybody else.” We paused beside a giant cottonwood as Tilousi pointed to a high cliff to the west. “See those two white marks up there?” Through binoculars I discerned a pair of white alkaline streaks made by seeping water in the ruddy cliff, seemingly inaccessible below the distant rim. “Those are two ears of corn, placed there by the Creator,” Tilousi said. “We pray to them, asking for plenty.” The Havasupais’ welcome mat is something of a facade, Tilousi admitted. Archaeologists had asked Havasupai to interpret the “rock writings”—had even, he insisted, taken chisels to certain petroglyph panels—but the people had objected. “We feel we should never tell anyone besides ourselves” what the rock art means, he said. 
“We don’t know what you want to do with that knowledge.” Visitors without guides are forbidden to explore the canyon beyond the main trail that leads down to the waterfalls, so the next day we hired two Havasupai in their mid-30s. Genial-faced Benjy Jones had the build of a sumo wrestler; Damon Watahomigie had less girth, a sharper mien and a fund of lore. We had hiked only 15 minutes when he stopped and pointed out a knob of rock far above us on the western rim. “See the frog?” he asked. The knob indeed looked like a frog preparing to jump. “The story is that the people were living at Wi-ka-sala—Beaver Canyon, on your maps—when all the waters receded,” Watahomigie said. “Everything was dying because of the new age. We weren’t people then; we were animals and insects. The chief sent out the frog to find a place where we could begin again. The frog hopped all over, until he finally found this place. He could hear the Colorado River.” We craned our necks, staring at the distant rock formation. “It was like Noah sending out the dove,” Watahomigie concluded. Looking for rock art, we headed off the trail and up a steep slope choked with brush and cactus. Jones produced a leaf cradling an oily, dark red paste made from hematite, or iron oxide, a clay that Native Americans often used as a paint. One of the Havasupais’ most treasured substances, hematite from the canyon has been found east of the Mississippi River, traded prehistorically over more than a thousand miles. Jones dipped his finger in the paste, then dabbed a streak on each of our boot soles. “Keeps the rattlesnakes away,” he explained. As the day wheeled on, we crisscrossed the canyon, with our guides leading us to rock art panels and ruins that few visitors ever see. There were several our guides wouldn’t let us visit. “The ones that are closed, we aren’t supposed to bother them,” Watahomigie said. By “closed,” I assumed he meant having stone-slab doors intact. His caution implied that the cliff buildings were the work of an earlier people. Archaeologists have debated Havasupai origins for half a century, strenuously and inconclusively. Some insist that a people called the Cohonina became the Havasupai. Others argue that the Havasupai, along with their linguistic cousins the Hualapai and Yavapai, are what they call Cerbat peoples, fairly recent migrants from the Great Basin of Nevada after a.d. 1350. Like many other Native American peoples, the Havasupai usually say they have lived forever in the place they inhabit. But when we asked Tilousi how long his people had lived in the canyon of the blue-green water, he did not go quite that far. “I wasn’t here billions of years ago,” he said. “I can’t put numbers to the years that have gone by. I will just say, since the beginning of the ice age.” On our last day in the Grand Canyon, Bill, Greg and I made a pilgrimage to a shrine deep in a little-traveled side valley that, like the Redwall caves guarding the split-twig figurines, had in all likelihood been an Archaic place of power. As we wound down a faint trail across an increasingly barren landscape, I saw nothing that even hinted at a prehistoric presence—not a single potsherd or chert flake in the dirt, not the faintest scratchings on a wayside boulder. But when we entered a small gorge in the Supai Sandstone stratum, a deep orange cliff loomed on our left about 50 feet above the dry creekbed. Halfway up, a broad ledge gave access to a severely overhanging wall. We scrambled up to the ledge.
During the previous 20 years, I had found hundreds of rock art panels in backcountry all over the Southwest. I knew the hallmarks of the styles by which experts have categorized them—Glen Canyon Linear, Chihuahuan Polychrome, San Juan Anthropomorphic and the like. But the Shamans’ Gallery, as this rock art panel has been named, fit none of those taxonomic pigeonholes. It was perhaps the most richly and subtly detailed panel I’d ever seen. Across some 60 feet of arching sandstone, vivid back-to-back figures were rendered in several colors, including two shades of red. Most of the figures were anthropomorphic, or human-shaped, and the largest was six feet tall. Polly Schaafsma, a leading expert on Southwestern rock art, has argued that the Shamans’ Gallery (which she named) was painted before 1000 b.c., based on the style of the figures. She feels that it embodies the visionary trances of religious seers—shamans. The rock shelter where the artists recorded their visions, she believes, must have been a sacred site. Had these ancient artists been part of the troupe (or clan) that had climbed into the Redwall caves to hide split-twig figurines? We have no way of knowing and no foreseeable way of finding out. But no matter. After two hours on the ledge, I stopped filling my notebook and simply stared. I tried to rid my mind of its Western, analytic itch to figure out what the paintings “meant” and surrendered to their eerie glory. In the presence of the Shamans’ Gallery, ignorance led to an unexpected kind of bliss. David Roberts is a veteran mountain climber and author of 27 books, including The Mountain of My Fear and Deborah. His latest book, The Lost World of the Old Ones, which chronicles archaeological discoveries in the ancient Southwest, is due out this spring.
daf8c876c8e432f658dddaded440e4eb
https://www.smithsonianmag.com/history/benedict-arnold-turned-traitor-american-revolution-180958786/
Why Benedict Arnold Turned Traitor Against the American Revolution
Why Benedict Arnold Turned Traitor Against the American Revolution He was short, solidly built (one acquaintance remembered that “there wasn’t any wasted timber in him”) and blessed with almost superhuman energy and endurance. He was handsome and charismatic, with black hair, gray eyes and an aquiline nose, and he carried himself with the lissome elegance of a natural athlete. A neighbor from Connecticut remembered that Benedict Arnold was “the most accomplished and graceful skater” he had ever seen. He was born in 1741, a descendant of the Rhode Island equivalent of royalty. The first Benedict Arnold had been one of the colony’s founders, and subsequent generations had helped to establish the Arnolds as solid and respected citizens. But Arnold’s father, who had settled in Norwich, Connecticut, proved to be a drunkard; only after his son moved to New Haven could he begin to free himself from the ignominy of his childhood. By his mid-30s he had had enough success as an apothecary and a seagoing merchant to begin building one of the finest homes in town. But he remained hypersensitive to any slight, and like many gentlemen of his time he had challenged more than one man to a duel. From the first, he distinguished himself as one of New Haven’s more vocal and combative patriots. On hearing of the Boston Massacre, he thundered, “Good God, are the Americans all asleep and tamely giving up their glorious liberties?” When in April 1775 he learned of the skirmishes at Lexington and Concord, he seized a portion of New Haven’s gunpowder supply and marched north with a company of volunteers. In Cambridge, Massachusetts, he convinced Dr. Joseph Warren and the Massachusetts Committee of Safety to authorize an expedition to capture Fort Ticonderoga in New York State and its 80 or more cannons. As it turned out, others had the same idea, and Arnold was forced to form an uneasy alliance with Ethan Allen and his Green Mountain Boys before the two leaders strode side by side into Ticonderoga. While Allen and his men turned their attention to consuming the British liquor supply, Arnold sailed and rowed to St. John, at the opposite end of Lake Champlain, where he and a small group of men captured several British military vessels and instantly gave America command of the lake. Abrupt and impatient with anything he deemed superfluous to the matter at hand, Arnold had a fatal tendency to criticize and even ridicule those with whom he disagreed. When a few weeks later a Continental Army officer named James Easton dared to question the legitimacy of his authority as the self-proclaimed commodore of the American Navy on Lake Champlain, Arnold proceeded to “kick him very heartily.” It was an insult Easton never forgot, and in the years ahead, he became one of a virtual Greek chorus of Arnold detractors who would plague him for the rest of his military career. And yet, if a soldier served with him during one of his more heroic adventures, that soldier was likely to regard him as the most inspiring officer he had ever known. The American Revolution as it actually unfolded was so troubling and strange that once the struggle was over, a generation did its best to remove all traces of the truth. Although it later became convenient to portray Arnold as a conniving Satan from the start, the truth is more complex and, ultimately, more disturbing.
Without the discovery of his treason in the fall of 1780, the American people might never have been forced to realize that the real threat to their liberties came not from without, but from within. ********** In that first Revolutionary spring of 1775, Arnold learned of the death of his wife, Margaret. Upon returning from Lake Champlain to New Haven, he visited her grave with his three young sons at his side. Arnold’s letters to her prior to the Revolution had been filled with pleas for her to write more often, and his grief upon her death seems to have been almost overpowering. And yet, for someone of Arnold’s restless temperament, it was inconceivable to remain in New Haven with his sorrow. “An idle life under my present circumstances,” he explained, “would be but a lingering death.” After just three weeks, Arnold left his children under the care of his sister Hannah and was on his way back to Cambridge, where he hoped to bury his anguish in what he called “the public calamity.” Over the next three years—in Canada, on Lake Champlain, in Rhode Island and Connecticut and again in New York—he made himself indispensable to his commander in chief, George Washington, and the Revolutionary cause. It is impossible to say when 37-year-old Benedict Arnold first met 18-year-old Peggy Shippen, but we do know that on September 25, 1778, he wrote her a love letter—much of it an exact copy of one he’d sent to another woman six months before. But if the overheated rhetoric was recycled, Arnold’s passion was genuine. Knowing of “the affection you bear your amiable and tender parents,” he had also written to Peggy’s loyalist-leaning father. “Our difference in political sentiments will, I hope, be no bar to my happiness,” he wrote. “I flatter myself the time is at hand when our unhappy contest will be at an end.” He also assured Peggy’s father that he was wealthy enough “to make us both happy” and that he had no expectations of any kind of dowry. Here in this letter are hints as to the motives behind Arnold’s subsequent behavior. While lacking the social connections of the Shippens, who were the equivalent of Philadelphia aristocracy, Arnold had had prospects of accumulating a sizable personal fortune. Now the British had abandoned their occupation of the revolutionaries’ capital, and Washington, needing something for Arnold to do while he recuperated from a battle-shattered left thigh, had named him the city’s military governor. Having lost once-significant wealth, Arnold embarked on a campaign of secret, and underhanded, schemes to re-establish himself as a prosperous merchant. That end—and those means—were not uncommon among officers of the Continental Army. But in September 1778 he did not yet have the money he needed to maintain Peggy in the style to which she was accustomed. There was also the matter of the Shippens’ politics. They might not be outright loyalists, but they had a decided distaste for the radical patriots who were waging an undeclared war on Philadelphia’s upper classes now that the British had gone. Given Arnold’s interest in Edward Shippen’s daughter and his lifelong desire to acquire the wealth his bankrupt father had denied him, it is not surprising that he embraced the city’s marginalized nobility with a vengeance. Thumbing his nose at the pious patriots who ruled the city, he purchased an ornate carriage and entertained extravagantly at his new residence, the same grand house the British general William Howe had occupied. 
He attended the theater, even though the Continental Congress had advised the states to ban such entertainments as “productive of idleness, dissipation and general depravity.” He issued passes to suspected loyalists wanting to visit friends and relatives in New York City, which was held by the British. He even appeared at a ball in a scarlet uniform, which led a young lady whose father had been arrested for corresponding with the British to joyfully exclaim, “Heyday, I see certain animals will put on the lion’s skin.” ********** One of Arnold’s misfortunes was that Joseph Reed had become a champion, however unlikely, of Pennsylvania’s radical patriots. A London-educated lawyer with an English wife, Reed had a reputation as one of Philadelphia’s finest and most ambitious attorneys before the Revolution. But the Reeds had not fit well into the upper echelons of Philadelphia society. Reed’s pious wife complained that one of Peggy Shippen’s relatives had accused her of being “sly,” claiming that “religion is often a cloak to hide bad actions.” Reed had served on Washington’s staff as adjutant general at the beginning, when Washington faced the daunting task of dislodging the British from Boston in 1775. But by the end of 1776, with the Continental Army run out of New York City and retreating across New Jersey, he had lost faith in his commander. Reed was away from headquarters when a letter arrived from the army’s second-ranking officer, Maj. Gen. Charles Lee. Assuming the letter related to official business, Washington promptly broke the seal. He soon discovered that Reed had established his own line of communication with Lee and that the primary topic of their correspondence was the failings of their commander in chief. Washington forwarded the letter to Reed with a note explaining why he had opened it, but otherwise let him twist in the icy emptiness of his withheld wrath. He kept Reed on, but their intimacy had ended. Brilliant, mercurial and outspoken, Reed had a habit of antagonizing even his closest friends and associates, and he eventually left Washington’s staff to serve in a variety of official capacities, always restless, always the smartest, most judgmental person in the room. As a New England minister wrote to Washington, the man was “more formed for dividing than uniting.” In the fall of 1778, Reed stepped down as a Pennsylvania delegate to Congress to help the state’s attorney general prosecute 23 suspected loyalists for treason. He lost 21 of those cases—there wasn’t much evidence to work with—but the position established him as one of the city’s most zealous patriots. That November, the two wealthy Quakers who had been convicted were hanged. In an apparent act of protest, Arnold hosted “a public entertainment” at which he received “not only Tory [or loyalist] ladies but the wives and daughters of persons proscribed by the state” in “a very considerable number,” Reed sputtered in a letter to a friend. Perhaps contributing to his ire was the fact that he and his wife had recently moved into the house next to Arnold’s and hadn’t been invited to the party. By December Reed was president of the state’s Supreme Executive Council, making him the most powerful man in one of the most powerful states in the country. He quickly made it clear that conservative patriots were the enemy, as were the Continental Congress and the Continental Army.
As council president, he insisted that Pennsylvania prevail in any and all disputes with the national government, regardless of what was best for the United States as a whole. Philadelphia was at the vortex of an increasingly rancorous struggle involving almost all the seminal issues related to creating a functioning democratic republic, issues that would not begin to be resolved until the Constitutional Convention of 1787. Amid all this upheaval, Reed launched an investigation into the military governor’s conduct. The prosecution of Benedict Arnold—a Washington favorite, an emblem of national authority and a friend to Philadelphia’s wealthy—would be the pretext to flex his state’s political muscle. And it would lead Arnold to doubt the cause to which he had given so much. ********** By late January 1779, Arnold was preparing to leave the military. Officials in New York State, where he was held in high regard, had encouraged him to consider becoming a landowner on the scale of the loyalist Philip Skene, whose vast estate at the southern tip of Lake Champlain had been confiscated by the state. Arnold’s financial dealings in Philadelphia had failed to yield the anticipated returns. Becoming a land baron in New York might be the way to acquire the wealth and prestige that he had always craved and that Peggy and her family expected. By early February he had decided to journey to New York, stopping to visit Washington at his headquarters in New Jersey. Reed, fearing that Arnold might escape to New York before he could be brought to justice for his sins in Philadelphia, hurriedly put together a list of eight charges, most of them based on rumor. Given the pettiness of many of the charges (which included being ungracious to a militiaman and preferring loyalists to patriots), Reed appeared to be embarked on more of a smear campaign than a trial. That Arnold was guilty of some of the more substantive charges (such as illegally purchasing goods upon his arrival in Philadelphia) did not change the fact that Reed lacked the evidence to make a creditable case against him. Arnold knew as much, and he complained of his treatment to Washington and the commander’s family of officers. Washington had refused to take sides in the dispute between Philadelphia’s radicals and conservatives. But he knew that Reed was hardly the steadfast patriot he claimed to be. For the last year, a rumor had been circulating among the officers of the Continental Army: Reed had been in such despair over the state of the war in late December 1776 that he’d spent the night of Washington’s assault on Trenton at a home in Hessian-occupied New Jersey, poised to defect to the British in the event of an American defeat. In that light, his self-righteous prosecution of Quakers and other loyalists seemed hypocritical in the extreme. It’s likely that Washington had heard at least some version of the claim, and just as likely that he took the charges against Arnold with a grain of salt. Still, Reed’s position on the Supreme Executive Council required that Washington accord him more civility than he probably deserved. On February 8, 1779, Arnold wrote to Peggy from the army’s headquarters in Middlebrook, New Jersey. “I am treated with the greatest politeness by General Washington and the officers of the army,” he assured her. He claimed that the consensus at headquarters was that he should ignore the charges and continue on to New York. 
Despite this advice, he had resolved to return to Philadelphia, not only to clear his name but because he was so desperately missing Peggy. “Six days’ absence without hearing from my Dear Peggy is intolerable,” he wrote. “Heavens! What must I have suffered had I continued my journey—the loss of happiness for a few dirty acres. I can almost bless the villainous...men who oblige me to return.” In utter denial regarding his complicity in the trouble he was now in, he was also deeply in love. ********** Back in Philadelphia, Arnold came under near-ceaseless attack from the Supreme Executive Council. But since the council was unwilling to provide the required evidence—primarily because it did not have any—the Congressional committee appointed to examine the charges had no choice but to find in Arnold’s favor. When the council threatened to withhold the state militia and the large number of state-owned wagons upon which Washington’s army depended, Congress tabled its committee’s report and turned the case over to Washington for a court-martial. More than a few Congressional delegates began to wonder what Reed was trying to accomplish. As a patriot and a Philadelphian, Congress’s secretary Charles Thomson had once considered Reed a friend. No more. Reed’s refusal to bring forward any legitimate evidence, combined with his continual assaults on the authority and integrity of Congress, made Thomson wonder whether his former friend was trying to destroy the political body upon which the country’s very existence depended. Was Reed, in fact, the traitor? The previous summer Reed had received an offer of £10,000 if he would assist a British peace commission’s efforts with Congress. In a letter published in a Philadelphia newspaper, Reed claimed to have indignantly refused the overture. But had he really? One of the commissioners had recently assured Parliament that secret efforts were under way to destabilize the government of the United States and that these “other means” might prove more effective in ending the war than military attempts to defeat Washington’s army. There is no evidence that Reed was indeed bent on a treasonous effort to bring down Congress, but as Thomson made clear in a letter to him, his monomaniacal pursuit of Arnold was threatening to accomplish exactly that. ********** In the meantime, Arnold needed money, and fast. He had promised Edward Shippen that he would bestow “a settlement” on his daughter prior to their marriage as proof that he had the financial resources Peggy’s father required. So in March of 1779, Arnold took out a loan for £12,000 and, with the help of a sizable mortgage, bought Mount Pleasant, a mansion on 96 acres beside the Schuylkill that John Adams had once claimed was “the most elegant seat in Pennsylvania.” There was one hitch, however. Although he had technically purchased Peggy a mansion, they were not going to be able to live in it, since Arnold needed the rental payments from the house’s current occupant to help pay the mortgage. Harassed by Reed, carrying a frightening burden of debt, Arnold nonetheless had the satisfaction of finally winning Edward Shippen’s consent, and on April 8, he and Peggy were married at the Shippens’ house. Now Arnold had a young, beautiful and adoring wife who was, he proudly reported the next morning to several of his friends, good in bed—at least that was the rumor the Marquis de Chastellux, a major general in the French Army who was fluent in English, heard later when visiting Philadelphia. 
However, within just a few weeks, Arnold was finding it difficult to lose himself in the delights of the connubial bed. Reed had not only forced a court-martial upon Arnold; he was now attempting to delay the proceedings so that he could gather more evidence. What’s more, he had called one of Washington’s former aides as a witness, an even more disturbing development since Arnold had no idea what the aide knew. Arnold began to realize that he was, in fact, in serious trouble. Aggravating the situation, his left leg was not healing as quickly as he had hoped, and his right leg became wracked by gout, making it impossible for him to walk. Arnold had been in tight spots before, but always had been able to do something to bring about a miraculous recovery. But now, what was there to do? If the last nine months had taught him anything, it was that the country to which he had given everything but his life could easily fall apart. Instead of a national government, Congress had become a facade behind which 13 states did whatever was best for each of them. Indeed, it might be argued that Joseph Reed was now more influential than all of Congress combined. What made all of this particularly galling was the hostility that Reed—and apparently most of the American people—held toward the Continental Army. More and more Americans regarded officers like Arnold as dangerous hirelings on the order of the Hessian mercenaries and British regulars, while local militiamen were looked to as the patriotic ideal. In reality, many of these militiamen were employed by community officials as thuggish enforcers to terrorize local citizens whose loyalties were suspect. In this increasingly toxic and volatile environment, issues of class threatened to transform a collective quest for national independence into a sordid and self-defeating civil war. By the spring of 1779, Arnold had begun to believe that the experiment in independence had failed. And as far as he could tell, the British had a higher regard for his abilities than his own country did. Gen. John Burgoyne was in London defending himself before Parliament with the claim that if not for Arnold, his army would have won the Battle of Saratoga. That February, the Royal Gazette had referred sympathetically to his plight in Philadelphia: “General Arnold heretofore had been styled another Hannibal, but losing a leg in the service of the Congress, the latter considering him unfit for any further exercise of his military talents, permit him thus to fall into the unmerciful fangs of the executive council of Pennsylvania.” Perhaps the time was right for him to offer his services to the British. ********** Arnold is usually credited with coming up with the idea himself, but there are reasons to think the decision to turn traitor originated with Peggy. Certainly the timing is suspect, following so soon after their marriage. Arnold was bitter, but even he had to admit that the Revolution had catapulted him from the fringes of respectability in New Haven to the national stage. Peggy, on the other hand, regarded the Revolution as a disaster from the start. Not only had it initially forced her family to flee from Philadelphia; it had reduced her beloved father to a cringing parody of his former self. How different life had been during those blessed months of the British occupation, when noble gentleman-officers had danced with the belles of the city.
With her ever-growing attachment to Arnold fueling her outrage, she had come to despise the revolutionary government that was now trying to destroy her husband. By marrying Peggy, Arnold had attached himself to a woman who knew how to get what she wanted. When her father had initially refused to allow her to marry Arnold, she had used her seeming frailty—her fits, her hysteria, whatever you wanted to call it—to manipulate him into agreeing to the engagement for fear that she might otherwise suffer irreparable harm. Now she would get her way with her equally indulgent husband. Given the ultimate course of Arnold’s life, it is easy to assume that he had fully committed himself to treason by the time he sent out his first feelers to the British in early May 1779. But that was not the case. He still felt a genuine loyalty to Washington. On May 5, Arnold wrote his commander what can only be described as a hysterical letter. The apparent reason for it was the delay of his court-martial to June 1. But the letter was really about Arnold’s fear that he might actually do as his wife suggested. “If your Excellency thinks me criminal,” he wrote, “for heaven’s sake, let me be immediately tried and if found guilty executed.” What Arnold wanted more than anything now was clarity. With the court-martial and exoneration behind him, he might fend off Peggy’s appeals. Joseph Reed, however, was bent on delaying the court-martial for as long as possible. In limbo like this, Arnold was dangerously susceptible to seeing treason not as a betrayal of all he had held sacred but as a way to save his country from the revolutionary government that was threatening to destroy it. In his anguish on May 5, he offered Washington a warning: “Having made every sacrifice of fortune and blood, and become a cripple in the service of my country, I little expected to meet the ungrateful returns I have received of my countrymen, but as Congress have stamped ingratitude as a current coin I must take it. I wish your Excellency for your long and eminent services may not be paid of in the same coin.” In the reference to money, Arnold unintentionally betrayed the real reason he had been moved to consider this course. If he handled the negotiations correctly, turning traitor could be extremely lucrative. Not only would he be able to walk away from his current financial obligations, he might command a figure from the British that would make him independently wealthy for life. On May 10, an emissary from Arnold reached John André, a British captain whom Peggy had come to know well in Philadelphia. But now André was living in New York City, which would become crucial to the Revolution’s prospects in the months ahead. Arnold wanted to explore the possibility of defecting, but first he needed to be assured of two things: Were the British in this war to stay? And how much were his services worth? In the tortuous months ahead, Arnold would survive his oft-delayed court-martial with a reprimand, and Washington would restore him to command. But the emissary’s visit was the first tentative step that led, in late summer-fall of 1780, to Arnold’s doomed effort to hand over the fortifications at West Point to the enemy. By reaching out to the British, Arnold gave his enemies the exquisite satisfaction of having been right all along. Like Robert E. Lee at the beginning of the American Civil War, Arnold could have declared his change of heart and simply shifted sides. 
But as he was about to make clear, he was doing this first and foremost for the money. Nathaniel Philbrick is the award-winning author of several books, including In the Heart of the Sea and Bunker Hill. His writing has appeared in The New York Times Book Review, the Wall Street Journal, Vanity Fair and other places.
af47f3799689577e98a39a3054dc457b
https://www.smithsonianmag.com/history/black-lives-certainly-mattered-abraham-lincoln-180976963/
Black Lives Certainly Mattered to Abraham Lincoln
Black Lives Certainly Mattered to Abraham Lincoln Last month, the San Francisco Unified School District voted to rename Abraham Lincoln High School because of the former president’s policies toward Native Americans and African Americans. As Jeremiah Jeffries, chairman of the renaming committee and a first grade teacher, argued, “Lincoln, like the presidents before him and most after, did not show through policy or rhetoric that black lives ever mattered to them outside of human capital and as casualties of wealth building.” Such a statement would have perplexed most Americans who lived through the Civil War. On January 1, 1863, Lincoln issued the Emancipation Proclamation, which declared enslaved people in areas under Confederate control to be “forever free.” Two years later he used all of the political capital he could muster to push the 13th Amendment through Congress, permanently abolishing slavery in the United States. Lincoln’s treatment of Native Americans, meanwhile, is a complex issue. Writing for Washington Monthly in 2013, Sherry Salway Black (Oglala Lakota) suggested that the “majority of his policies proved to be detrimental” to Indigenous Americans, resulting in significant loss of land and life. Critics often cite Lincoln’s approval of the executions of 38 Dakota men accused of participating in a violent uprising; it remains to this day the largest mass execution in United States history. Lincoln’s detractors, however, often fail to mention that the president pardoned or commuted the sentences of 265 others, engaging in “by far the largest act of executive clemency in American history,” per historian James M. McPherson in The New York Times. The San Francisco committee opted not to consult any historians when considering the renaming, which Jeffries justified by saying, “What would be the point? History is written and documented pretty well across the board. And so, we don’t need to belabor history in that regard.” But the point should be belabored. During the Civil War, Lincoln worked assiduously to expand rights for African Americans. In response, most black Americans who lived through the war looked to him with great admiration and respect. Among the thousands of letters that arrived at the White House during the Civil War, at least 125 came from African Americans. Their missives discussed a wide range of topics, including military service, inequality in society, the need for financial assistance, and the protection of their rights. One black soldier, for example, wrote, “i have ben sick Evy sence i Come her and i think it is hard to make A man go and fite and wont let him vote . . . rite soon if you pleze and let me no how you feel.” Other constituents sent gifts and poems to the president. To be sure, Lincoln saw very few of these letters, as his private secretaries typically routed them to other federal departments. But when presented with a case in which he could intervene, Lincoln often did so. Some of the most touching letters showed the personal connection that enslaved men and women felt with the president. In March 1865, one black refugee from Georgia wrote, “I take this opportunity this holy Sabbath day to try to express my gratitude and love to you. With many tears I send you this note through prayer and I desire to render you a thousand thanks that you have brought us from the yoke of bondage. 
And I love you freely.” He then proceeded to describe a dream he’d had many years before, in which “I saw a comet come from the North to the South and I said good Lord what is that?” The man’s enslaver “threatened my life if I should talk about this. But I just put all my trust in the Lord and I believe he has brought me conqueror through.” The comet in this dream, this correspondent believed, was Lincoln. The president, in turn, was so touched by the letter that he kept it in his personal collection of papers, which is now housed at the Library of Congress. Lincoln also met hundreds of African Americans in Washington during the war years. Some came to the White House at his invitation; others walked through the White House gates uninvited and unannounced. Regardless of how they arrived at his doorstep, the president welcomed these visitors with open arms and an outstretched hand. As Frederick Douglass was proud to say after his first White House meeting in August 1863, Lincoln welcomed him “just as you have seen one gentleman receive another.” Black visitors to the White House often remarked that Lincoln treated them with dignity and respect. Many were touched by how he shook their hands and made no acknowledgement of their race or skin color. Lincoln’s hospitality toward African Americans came to be well known at the time: As white Union nurse Mary Livermore observed, “To the lowly, to the humble, the timid colored man or woman, he bent in special kindliness.” Writing in 1866, a Washington journalist similarly noted that the “good and just heart of Abraham Lincoln prompted him to receive representatives of every class then fighting for the Union, nor was he above shaking black hands, for hands of that color then carried the stars and stripes, or used musket or sabre in its defense.” Lincoln appears to have always shaken hands with his black guests. And, in almost every instance, he seems to have initiated the physical contact, despite the fact that shaking hands, for Lincoln, could be an understandably tiresome chore. “[H]e does it with a hearty will, in which his entire body joins,” wrote one observer, so that “he is more weary after receiving a hundred people than some public men we could all name after being shaken by a thousand.” Yet the president warmly, kindly, eagerly and repeatedly grasped the hands of his black guests. This seemingly small gesture should not be discounted, for it carried not only great personal meaning for the visitors, but also important symbolic meaning for all Americans who witnessed the encounters or read about them in the newspapers. Most white politicians would not have been so genuinely welcoming to African Americans. As historian James O. Horton and sociologist Lois E. Horton wrote in 1998, black Americans “often worked with white reformers … who displayed racially prejudiced views and treated [them] with paternalistic disrespect,” including refusals to shake their hands. Reformers continued to offer snubs like this in the postwar period. During his run for the presidency in 1872, for example, newspaper publisher Horace Greeley ostentatiously showed disdain for a black delegation from Pennsylvania that sought to shake his hand. Not so with Lincoln. On April 29, 1864, a delegation of six black men from North Carolina—some born free, others enslaved—came to the White House to petition Lincoln for the right to vote. 
As the men approached the Executive Mansion, they were directed to enter through the front door—an unexpected experience for black men from the South, who would never have been welcomed this way in their home state. One of the visitors, Rev. Isaac K. Felton, later remarked that it would have been considered an “insult” for a person of color to seek to enter the front door “of the lowest magistrate of Craven County, and ask for the smallest right.” Should such a thing occur, Felton said, the black “offender” would have been told to go “around to the back door, that was the place for niggers.” In words that alluded to the Sermon on the Mount, Felton likened Lincoln to Christ: “We knock! and the door is opened unto us. We seek, the President! and find him to the joy and comfort of our hearts. We ask, and receive his sympathies and promises to do for us all he could. He didn’t tell us to go round to the back door, but, like a true gentleman and noble-hearted chief, with as much courtesy and respect as though we had been the Japanese Embassy he invited us into the White House.” Lincoln spoke with the North Carolinians for some time. He shook their hands when they entered his office and again when the meeting ended. Upon returning home, the delegation reported back to their neighbors about how “[t]he president received us cordially and spoke with us freely and kindly.” Outside of the White House, Lincoln also showed kindness toward the black Americans he encountered. In May 1862, he visited an army hospital at Columbian College (now George Washington University) where a white nurse introduced him to three black cooks who were preparing food for sick and wounded soldiers. At least one of the cooks had been previously enslaved. Lincoln greeted them in “a kindly tone,” recalled the nurse. “How do you do, Lucy?” he said to the first. The nurse then remarked that he stuck out his “long hand in recognition of the woman’s services.” Next Lincoln gave the two black men a “hearty grip” and asked them, “How do you do?” When the president left the room, the three black cooks stood there with “shining faces” that testified to their “amazement and joy for all time.” But soon, sadly, the nurse realized what the convalescing Union officers thought of this scene. They expressed a “feeling of intense disapprobation and disgust” and claimed that it was a “mean, contemptible trick” for her to introduce them to the president. Lincoln has received a good deal of criticism in the modern era for his views on race. For much of his adult life—including during part of his presidency—he pushed for African Americans to voluntarily leave the United States through a process known as colonization. In August 1862, he condescendingly lectured a delegation of black Washingtonians about why they should endorse this policy. As unfortunate as this meeting appears in retrospect (as it did to many at the time), he invited these men to his office in order to accomplish a larger political purpose. Soon afterward Lincoln publicized his words in the newspapers, hoping that they would help prepare the northern electorate for executive action regarding slavery. In essence, he hoped to persuade white voters not to worry about emancipation because he would promote policies that were in their best interest. Meanwhile, Lincoln was planning to do something momentous and unprecedented—issue his Emancipation Proclamation. 
Many today also criticize Lincoln for issuing the Emancipation Proclamation as a “military necessity”—a policy to help win the war—rather than as a clarion call for justice. Such views have gained currency in the broader popular culture. In 1991, for example, Tupac Shakur rapped, “Honor a man that refused to respect us / Emancipation Proclamation? Please! / Lincoln just said that to save the nation.” But the truth is, Lincoln needed to justify his controversial action constitutionally—as a war measure—so that it could hold up in court if it were challenged. Framing emancipation this way does not diminish his deeply held belief in the immorality of slavery. As he said upon signing the proclamation, “my whole soul is in it.” Indeed, Lincoln issued the proclamation out of moral duty as well as military necessity, as evidenced by a meeting he had with Frederick Douglass toward the end of the war. By August 1864, Lincoln had become convinced that he would lose reelection, allowing an incoming Democratic administration to undo all he had done to bring freedom to the enslaved. The president invited Douglass to the White House, where the two men devised a plan to encourage people still held in bondage to flee to Union lines before Lincoln left office, should he lose. Lincoln said, “Douglass, I hate slavery as much as you do, and I want to see it abolished altogether.” Lincoln’s plan had nothing to do with helping him win the war (“military necessity”) or the election; it had everything to do with his deep-seated moral disdain for slavery. For his part, Douglass left the meeting with a new understanding of the president’s intense commitment to emancipation. “What he said on this day showed a deeper moral conviction against slavery than I had ever seen before in anything spoken or written by him,” Douglass later wrote. Fortunately, nothing ever had to come of this desperate plan. The war took a turn for the better, and Lincoln easily won reelection in November 1864. In the end, Lincoln’s welcoming of African Americans to the White House was an act of political courage undertaken at great political risk. Indeed, Douglass, probably more than any other person, understood the significance of Lincoln’s open-door policy. “He knew that he could do nothing which would call down upon him more fiercely the ribaldry of the vulgar than by showing any respect to a colored man,” said Douglass shortly after Lincoln’s death. And yet that is precisely what Lincoln did. Douglass concluded: “Some men there are who can face death and dangers, but have not the moral courage to contradict a prejudice or face ridicule. In daring to admit, nay in daring to invite a Negro to an audience at the White house, Mr. Lincoln did that which he knew would be offensive to the crowd and excite their ribaldry. It was saying to the country, I am President of the black people as well as the white, and I mean to respect their rights and feelings as men and as citizens.” For Lincoln, black lives certainly mattered. Jonathan W. White is associate professor of American Studies at Christopher Newport University and author or editor of eight books, including Midnight in America: Darkness, Sleep, and Dreams during the Civil War (2017), which has a chapter on the dreams of African Americans during the Civil War. Check out his website at jonathanwhite.org.
e024ecd4faf5c074cd5385cb2f04910f
https://www.smithsonianmag.com/history/blank-dead-sea-scroll-fragments-text-180974894/
Text Found on Supposedly Blank Dead Sea Scroll Fragments
Text Found on Supposedly Blank Dead Sea Scroll Fragments Hidden bits of text written in Hebrew and Aramaic have been revealed on four fragments of Dead Sea Scrolls long thought to be blank. The pieces of parchment had been excavated by archaeologists and donated to a British researcher in the 1950s, reinforcing their authenticity at a time when other supposed Dead Sea Scroll fragments have proven to be fakes. Stashed by members of a Jewish sect nearly 2,000 years ago, the Dead Sea Scrolls contain some of the oldest known fragments of the Hebrew Bible. In the 1940s and 1950s, Bedouin tribe members and archaeologists rediscovered these texts in the arid caves of Qumran, a site about 12 miles east of Jerusalem in the West Bank overlooking the Dead Sea. A few years ago, a team of researchers set out to study artifacts from the Qumran Caves that had been dispersed to museums and collections around the world. “In the early days of research, in the '50s and '60s, the excavators sometimes donated many artifacts, usually ceramics, to collaborating museums as gifts,” says Dennis Mizzi, a senior lecturer in Hebrew and ancient Judaism at the University of Malta. Mizzi and his colleagues suspected some evidence from the caves might have gotten lost or overlooked along the way as these objects were separated from their original context. They found decomposed papyrus that was previously thought to be bat dung on the inside lid of one Qumran jar. They tracked down textiles used to wrap the scrolls that had been stored in a cigarette box. But they never intended to look for lost texts. Then the researchers revisited a collection of supposedly blank Dead Sea Scroll fragments that the Jordanian government gave to a leather and parchment expert at the United Kingdom's University of Leeds in the 1950s. Because these fragments appeared “uninscribed,” they were thought worthless to text-seeking biblical scholars, but perfect for tests the Leeds researcher wanted to perform to date the scrolls. “When fragments were submitted for destructive analyses, they cut very thin specimens (not larger than a couple of mm) from the existing fragments. In other words, they did not submit entire fragments for such analyses,” adds Mizzi. That collection was donated to the University of Manchester in 1997 and has remained in storage in its John Rylands Library ever since. Upon examining a supposedly blank fragment in that collection, researcher Joan Taylor of King’s College London thought she saw faint traces of a lamed—the Hebrew letter “L.” Following this hint, 51 seemingly blank fragments bigger than 1 centimeter were submitted to be photographed. The library team used multispectral imaging, a technique that captures different wavelengths of the electromagnetic spectrum, including some invisible to the naked eye. Taylor, Mizzi and their third collaborator, Marcello Fidanzio of the Faculty of Theology of Lugano, were surprised when they got the results and saw obvious lines of text on four of the fragments. “There are only a few on each fragment, but they are like missing pieces of a jigsaw puzzle you find under a sofa,” Taylor said in a statement announcing the discovery. “Some words are easily recognizable, like ‘Shabbat’,” Mizzi says. That word appears in a fragment with four lines of text, and may be related to the biblical book of Ezekiel, he adds. However, he and his colleagues are only beginning to interpret the fragments, and he says it is too early to speculate on their meaning. 
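Multispectral imaging works because different materials respond differently across wavelengths: carbon-based inks tend to stay dark into the near-infrared, while aged parchment brightens, so comparing bands can pull faded letters out of apparently blank skin. As a rough illustration only (the function name, band ordering and normalized-difference trick below are assumptions of this sketch, not the Rylands team’s actual pipeline), a band comparison might look like this in Python:

```python
import numpy as np

def enhance_ink(bands: np.ndarray) -> np.ndarray:
    """Toy band-comparison enhancement for a multispectral image stack.

    bands: float array of shape (n_bands, height, width), reflectance
    scaled 0..1, assumed ordered from visible light to near-infrared.
    """
    visible = bands[0]    # ink and darkened parchment both look dark here
    infrared = bands[-1]  # parchment reflects brightly; carbon ink stays dark
    # Normalized difference: pixels that brighten sharply between bands
    # (parchment) separate from pixels that stay dark in both (ink).
    return (infrared - visible) / (infrared + visible + 1e-6)
```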
“We’re still working to figure out the letters that are visible on the fragments,” Mizzi says. The team wants to perform further tests to elucidate the physical aspects of the artifacts, including the composition of the ink and the production of the parchment. It is rare for new, authentic pieces of text from the Dead Sea Scrolls to surface. Thankfully, these fragments have a well-documented history. The researchers know they were excavated in Cave 4 at Qumran, where the majority of the Dead Sea Scrolls were found along with thousands of fragments from around 500 texts. Compare that to about 70 new fragments of the scrolls with unknown provenance that started circulating in the antiquities market over the past two decades. Although many of these texts were interpreted by biblical scholars and appeared in academic journals and books, some researchers remained skeptical about the texts’ authenticity because of their murky origins and other red flags, like the style of handwriting. The Museum of the Bible, which opened in Washington, D.C., in 2017, had 16 of those newly surfaced fragments in its collection after they were acquired by Hobby Lobby president Steve Green. Earlier this year, an independent team of art fraud investigators determined that all 16 are modern forgeries. Robert Cargill, an associate professor of classics and religious studies at the University of Iowa who was not involved in the new study, contrasted the fragments collected by the Museum of the Bible with the “properly-excavated, less sensational” fragments in the Rylands Library, which “turned out to be the real treasures.” “Unlike the repeated scandals being reported at the Museum of the Bible, this discovery within the collection of the John Rylands Library is a reassuring success story about the use of new technological approaches in archaeology,” Cargill says, “and a reminder of the importance of provenanced objects that may not appear sensational at first glance.” Megan Gannon is a science journalist who often writes about archaeology and space. She was previously a news editor at Live Science and Space.com.
048a7430118daa37e4d9c879dfaef176
https://www.smithsonianmag.com/history/bloody-attempt-kidnap-british-princess-180950202/
The Bloody Attempt to Kidnap a British Princess
The Bloody Attempt to Kidnap a British Princess There were seven men in total who tried to stop Ian Ball, an unemployed laborer from north London, from kidnapping Princess Anne, Queen Elizabeth’s only daughter. A tabloid journalist, a former boxer, two chauffeurs and three policemen all faced off against Ball, but it was the princess herself, a force to be reckoned with in her own right, who kept Ball distracted from his goal. Around 8 p.m. on March 20, 1974, Princess Anne and her husband of four months were heading towards Buckingham Palace after attending a charity film screening. Anne’s lady-in-waiting sat across from the couple in the back of a maroon Rolls-Royce limousine marked with the royal insignia, and in the passenger seat rode her bodyguard: Inspector James Wallace Beaton, a member of SO14, Scotland Yard’s special operations branch charged with royalty protection. As the chauffeur drove down the Mall, a road that runs between London’s Trafalgar Square and Buckingham Palace, a white Ford Escort overtook the limousine and forced the chauffeur to stop about 200 yards away from the palace. A bearded man with light red hair exited the car and, holding two handguns, charged towards the rear of the limo. Inspector Beaton, 31, assumed that the man was a disgruntled driver and stepped out to meet him. From six feet away, the assailant shot the officer in his right shoulder. **** In aiming to kidnap Anne, Ian Ball was targeting Britain’s celebrity royal of the day. The previous November, the 23-year-old princess had married a commoner – Mark Phillips, a captain in the British army. The two had met through equestrian circles: the talented horseman had won a team gold medal at the 1972 Munich Olympics, and in 1971, the BBC had named Anne, later an Olympic equestrian along with Phillips in the 1976 games, as its Sports Personality of the Year. Their nuptials attracted 2,000 guests, and The New York Times said the televised audience of 500 million was “the most ever” for a wedding. In a piece indicating that the media’s fascination with celebrity hasn’t changed all that much, NYT journalist John J. O’Connor wrote that “network television’s coverage blitz” was “lacking much substance” and “could only leave the average viewer puzzled and blinking.” On the night of the kidnapping attempt, SO14 had assigned only one man to protect the princess, but then again only one bodyguard accompanied Queen Elizabeth on unofficial trips to and from her residence at the time. Although Ball would not have known the route that the limousine would take that night, the palace had publicized Princess Anne’s appearance at the event, potentially making it easy for someone to follow the maroon Rolls-Royce as it escorted her from the theater that evening. A 26-year-old man with a history of mental illness, Ball had rented a car under the name of John Williams, in which police would later find two pairs of handcuffs, Valium tranquilizers, and a ransom letter addressed to the Queen. He had typed a rambling note that criticized the royal family and demanded a £2 million ransom to be delivered in £5 sterling notes. Ball asked that the Queen have the money stored in 20 unlocked suitcases and put on a plane destined for Switzerland. Queen Elizabeth II herself, wrote Ball, needed to appear on the plane to confirm the authenticity of her signatures on needed paperwork. *** Although few of London’s Metropolitan police carried guns, those assigned to protect the royal family carried automatic weapons. 
Inspector Beaton tried to shoot Ian Ball, but his wounded shoulder hurt his aim. After firing once, his gun jammed. Ball turned to the rear door behind the driver’s seat and started shaking it. Anne sat on the other side. “Open, or I’ll shoot!” he yelled. As the princess and Captain Phillips did their best to hold the door shut, Princess Anne’s lady-in-waiting crawled out of the door on the passenger side. Beaton took the opportunity to jump back in the limo. He placed himself between the couple and their assailant, who shot into the car. Beaton’s hand deflected the bullet. Ball then shot him a third time, causing a wound that forced Beaton out of the car and onto the ground. Chauffeur Alexander Callender, one of the Queen’s drivers, stepped out to confront the gunman. Ball shot him in the chest and Callender fell back into the car. Pulling the back door open, Ball grabbed Anne’s forearm as Phillips held onto her waist. “Please, come out,” said Ball to Anne. “You’ve got to come.” As the two men struggled over Anne, her dress ripped, splitting down the back. Instead of panicking, she had what she later called “a very irritating conversation” with her potential kidnapper. “I kept saying I didn’t want to get out of the car, and I was not going to get out of the car,” she told police. In response to one of Ball’s pleas, Princess Anne retorted, “Not bloody likely.” “I was frightened, I won’t mind admitting it,” Captain Phillips later said. The scariest part, he remembered, was feeling like a caged animal when police officers started arriving. Then “the rescue was so near, but so far” as constables hesitated to advance on an armed man so near the princess. Police Constable Michael Hills, 22, was first on the scene. Patrolling nearby when he heard the sounds of a struggle, he assumed the conflict was over a car accident. He approached Ball and touched his shoulder. The gunman turned and shot Hills in the stomach. Before collapsing, Hills maintained enough strength to radio his station. Ronald Russell, a cleaning company executive, was driving home from work when he saw the scene on the side of the road. He approached on foot after seeing Ian Ball confront Officer Hills. “He needs sorting,” Russell later remembered thinking. A 6’4” former boxer, Russell advanced to punish the shooter for hurting a policeman. Another motorist, a chauffeur named Glenmore Martin, had parked his car in front of the white Ford to keep Ball from escaping. He also tried to distract Ball, but when the gunman aimed at him, Martin turned to help Officer Hills on the side of the road. Meanwhile, Daily Mail journalist John Brian McConnell came onto the scene. Recognizing the insignia on the limo, he knew a member of the royal family was in danger. “Don’t be silly, old boy,” he said to Ball. “Put the gun down.” Ball shot him. McConnell fell to the road, now the third man bleeding onto the pavement. After McConnell fell, Ball turned back to his struggle for Princess Anne. Ronald Russell approached from behind and punched Ball in the back of the head. While the former boxer distracted the gunman, Anne reached for the door handle on the opposite side of the backseat. She opened it and pushed her body backwards out of the car. “I thought that if I was out of the car that he might move,” she said. She was right. As Ball ran around the car towards the princess, she jumped back in with Phillips, shutting the door. Ronald Russell then punched Ball in the face. More police officers were now witnessing the action. 
Princess Anne noticed their presence made Ian Ball nervous. “Go on,” she said. “Now’s your chance.” He took off running. Peter Edmonds, a temporary detective constable, had heard Officer Hills’ call regarding the attack. As he pulled up to the scene in his own car, he saw a man take off with a gun through St. James Park. Edmonds chased Ball, threw his coat over Ball’s head, tackled him and made an arrest. Authorities found over £300 in £10 notes on his person. Later, they learned that earlier that month, Ball had rented a home on a dead-end road in Hampshire, five miles away from Sandhurst Military Academy, also the home of Princess Anne and Captain Phillips. The next day, headlines around America recounted the night’s events: “Princess Anne Escapes Assassin”; “Lone Gunman Charged in Royal Kidnap Plot”; “Security Increases Around Prince Charles”; “Witnesses Describe Panic on the Mall”; “Queen is Horrified at Attack on Princess.” “If someone had tried to kidnap Julie Nixon Eisenhower on Park Avenue,” wrote The New York Times, the press would create “within a day or two” a “lavish portrait of that someone.” Because of British laws that limited pre-trial publicity, “just about all that Brits are likely to know for the next month or two they know already.” Home Secretary Roy Jenkins ordered an investigative report for the Prime Minister and told the press that the investigation needed to remain “broadly confidential”; both Scotland Yard and Buckingham Palace refused to comment on specific details. Journalists scrambled to pull together theories on how a mentally ill, unemployed man could have masterminded a well-funded kidnapping attempt on his own. An office clerk told a reporter that the police had traced a typewriter that Ball had rented to write the ransom letter. Papers reported that one line of the letter read “Anne will be shot dead.” Days after the kidnapping attempt, a group calling themselves the Marxist-Leninist Activist Revolutionary Movement sent a letter claiming responsibility to The Times of London. Scotland Yard dismissed any connection between that group and Ian Ball. Others recognized a familiar theme in the reported content of the ransom letter, in which Ball had allegedly stated that he would donate the Queen’s ransom to the National Health Service. One month before, a group identifying as the Symbionese Liberation Army had kidnapped Patricia Hearst. In its communication with the Hearst family, the SLA said that they would return the young woman if her family donated what would amount to millions of dollars of food to hungry Californians. “There is no present indication that this was other than an isolated act by an individual,” Jenkins told the House of Commons. The Commons agreed with his request that the findings of the investigation remain confidential. Secretary Jenkins told the papers that he ordered an increase in royal protection but refused to comment on the details. Buckingham Palace released a statement saying that the royal family “had no intention of living in bullet-proof cages.” Chief among them was Princess Anne, who valued her privacy even while recognizing her good fortune in escaping unscathed. “There was only one man,” she later said. “If there had been more than one it might have been a different story.” The princess recognized in an interview that one’s “greatest danger” is perhaps “the lone nutcases” that “have just got enough” resources to put a crime together. 
“If anybody was serious on wiping one out, it would be very easy to do.” When Ian Ball appeared in court on April 4, his lawyer spoke about his history of mental illness, but Ball also gave a statement on what motivated his crime: “I would like to say that I did it because I wished to draw attention to the lack of facilities for treating mental illness under the National Health Service.” Ian Ball pleaded guilty to attempted murder and kidnapping charges. Sentenced to a life term in a mental health facility, he has spent at least part of his confinement at Broadmoor, a high-security psychiatric hospital. Even after Ian Ball’s sentencing, the public would know little else about him except for his birth date and birthplace, and eyewitness accounts of his appearance and actions. In 1983, Ball penned a letter to a member of Parliament in which he claimed that the attempted kidnapping was a hoax, and that he was framed. (Scotland Yard’s files on the investigation remained closed until January 1, 2005, when the British National Archives released them under the “thirty-year rule,” which requires the release of cabinet papers 30 years after their filing.) Less than ten years after the botched kidnapping, the press criticized Scotland Yard again for failing to protect the royal family when, in July 1982, an unemployed man scaled the palace walls and snuck into Queen Elizabeth’s bedroom. The two talked for ten minutes before the queen could summon help. The following year, Scotland Yard reorganized the Royalty Protection Branch and installed James Wallace Beaton as its superintendent. The day after the attack, Princess Anne and Captain Mark Phillips returned to routine at their home on the grounds of Sandhurst: he instructed cadets on the rifle range, and she tended to her horses. That September, Queen Elizabeth II awarded the George Cross, Britain’s highest civilian award for courage, to Inspector Beaton. She presented the George Medal, the second-highest civilian honor for bravery, to Police Constable Hills and Ronald Russell, and Queen’s Gallantry Medals (the third-highest) to Police Constable Edmonds, John Brian McConnell and Alexander Callender. Glenmore Martin received the Queen’s Commendation for Brave Conduct. While Scotland Yard refuses to release specifics on SO14, an internal police budget in 2010 revealed that it spent approximately £113.5 million on royal security. By 2012, this number had reportedly decreased to £50 million. As part of the revised budget, Scotland Yard slashed monies dedicated to protecting “non-working royals,” such as Prince Andrew’s daughters (and Anne’s nieces), Princesses Eugenie and Beatrice, except for when they are at official family events. Prince Andrew privately hired security to accompany his daughters, fearing for their safety as his mother had feared for Anne’s 40 years earlier. In a 2006 interview, Ronald Russell recalled what Queen Elizabeth said as she presented his George Medal: “The medal is from the Queen of England, the thank you is from Anne’s mother.” Carrie Hagen is a writer based in Philadelphia. She is the author of We Is Got Him: The Kidnapping that Changed America, and is currently writing a book about the Vigilance Committee.
85463d75a836d12759479259b7ef554f
https://www.smithsonianmag.com/history/botulism-outbreak-gave-rise-americas-food-safety-system-180969868/
The Botulism Outbreak That Gave Rise to America’s Food Safety System
The Botulism Outbreak That Gave Rise to America’s Food Safety System My seventh-grade science teacher repeated two facts so often that they are still crystal clear in my memory. The first was the definition of osmosis: “the passing of a substance from a lesser concentration to a greater concentration through a semi-permeable membrane.” The other was this: dented canned food can poison you with botulism, the deadliest toxin on the planet. Why these two facts seemed among the most important things to teach 12-year-olds in the 1990s is not exactly clear, but it stands to reason that at least the latter fact came from inherited wisdom. This middle-aged teacher in Arkansas had likely heard about botulism in canned food from his own mother and grandmother, seizing upon it as this singularly cool fact, relevant in the kitchen and in the science classroom. The terror of the botulism bacteria and the chaos it could wreak belied the boring, innocuous image of the tin can. By the time I was sitting at that molded plastic school desk, it was hard for Americans to imagine anything less scary than canned food. In a nation of Lunchables and DunkAroos, we believed in the power and safety of the food industry, of which canned food was a part. But I later became a student of history and, by a funny turn of events, began to study the history of canned food. I learned of a time when cans were novel and unfamiliar, and when they inspired distaste, fear, and panic. These experiences still shape America, and how it eats, today. Canned food got its start in the opening years of the 19th century in France and moved to America by 1825, but only began to enter average American homes in the years after the Civil War. The war exposed millions of soldiers to canned food, and they brought the taste home with them. But the new industry also struggled to convince American consumers to consider its products viable and trustworthy. There were many reasons why early consumers weren’t all that interested in trying these new offerings. For one, the long hours that cans of food were boiled left the contents mushy, with an unattractive texture and taste. But even before tasting the food, many Americans were skeptical. To people accustomed to seeing and touching and smelling the foods they were about to eat, these hard-sided, opaque metal objects did not seem like food. The new method of industrial production and new way of eating felt foreign to American consumers, who had grown up eating food that was more local, more perishable, and easier to fit into existing categories. As the United States entered an era of industrialization and urbanization, the unfamiliar can embodied this time of rapid change. In the half-century after the war, innovations followed as the canning men—and they were almost all men—built their business from the ground up, hoping to overcome consumer resistance. The canners perfected machinery to build the cans and process the fruits and vegetables; they organized professional trade groups; they worked with agricultural scientists to breed crops better fit for the can; and they invited government regulation as they helped craft pure food laws. The American food supply has undergone a revolution, moving away from a system based on fresh, locally grown goods to one dominated by packaged foods. How did this come to be? How did we learn to trust that food preserved within an opaque can was safe and desirable to eat? One central problem that the canners worked to address was spoilage. 
Even though the canning process killed existing bacteria and created a vacuum seal to keep more bacteria from getting in, the method wasn’t foolproof. If the temperature of the water bath was too low, or it boiled unevenly, or the pressure was insufficient, or the cans weren’t processed long enough, or the seals were weak—or if there were any other flaw in the process—spoilage could occur. Canners thus invested in bacteriology and public health oversight. With the acceptance of germ theory in the late 19th century, canners embraced this new awareness of the microbial life that could wreak such outsized havoc, seeing it as a key to solving their spoilage issues. Beginning in the 1890s, the industry sponsored scientific work to address bacterial contamination. Before long, canners felt they had gained control over this microscopic foe. Most canned food spoilage is fairly obvious—either the can itself becomes deformed or its contents are visibly spoiled—and relatively harmless, perhaps leading to digestive upset or mild illness. But there was one rare bacterium that was far from harmless: Clostridium botulinum. It produces botulinum toxin, the deadliest toxin known to humankind, which can’t be detected by sight, smell, or taste. Botulism doesn’t itself cause cans to become deformed, neither dented nor bulging, but those external signs often suggest an insufficient canning process, which can breed both C. botulinum and other kinds of bacteria that have more visible effects. The bacterium is also anaerobic, meaning it thrives in oxygen-free environments, precisely like the inside of a sealed can. Though it was rare, botulism terrified canners. Their worst fears materialized in late 1919 and early 1920, when a series of deadly botulism cases struck unassuming consumers throughout the country, killing 18 people in Ohio, Michigan, and New York, with smaller outbreaks in other states. The deaths were traced back to canned black olives, a mainstay of hors d’oeuvre plates and a delicacy often reserved for special occasions. The olives had been packed in California and then shipped across the country to far-flung destinations, the result of a newly nationalized commercial food system. The National Canners Association and California Canners League sprang into action, recognizing the particular vulnerability of this moment. These botulism deaths—widely publicized in mainstream media outlets—threatened to undermine the still-shaky foundation of the canned food business, fueling consumers’ deepest fears about these processed foods. The canners worked on two fronts. Even as they sought to deflect responsibility and downplay media coverage of the deaths, they launched an expensive research and inspection campaign that would lay the groundwork for the American food safety system. In early December 1919, the canning and olive industries came together to fund a Botulism Commission of scientific experts tasked with producing specific strategies for safely processing olives to prevent such a crisis from happening again. After much negotiation, the Botulism Commission’s findings led to strict regulations for the processing of olives—240 degrees Fahrenheit for at least 40 minutes—and a statewide inspection service, funded by the industries, but overseen by the impartial California State Board of Health. By 1925, many of these standardized practices had expanded to other food products, covering sardines, tuna, and all vegetable products except tomatoes. 
In the process, three distinct groups—scientists, canners, and government officials—established a set of relationships. As they got to know each other and worked through their competing commitments and quirks, they built the network that would underpin the nation’s food system. Because the canning industry had taken a lead role in this network, many critical consumers were mollified, leading to acceptance of canned food, and later processed food, in the decades to come. This small story of a food scare and an emerging industry’s embrace of food safety regulation encapsulates the larger story of American commerce in the 20th century. In solving the problem of botulism, an industry threatened with destruction instead came back with a set of practices that revolutionized not only canned food, but the entire relationship between science, government, and the food industry in America today. In this early phase, the canners were as much players in policing themselves as external regulators were. By the time I heard that questionable information about botulism from my science teacher in the 1990s, I was part of a food system awash in processed foods. By then, dented cans—or any cans—were very unlikely to harbor botulism bacteria, which had been largely brought under control by those new processing methods and regulations. This paved the way for our contemporary American food culture, in which we eat and unthinkingly trust processed food. Yes, the country still experiences occasional food safety outbreaks. But rarely are these from canned food, which—along with the vast array of food products that line our lunchboxes and grocery store shelves—has escaped the reputation that first inspired my teacher’s inherited wisdom generations ago. Of course, the definition of osmosis is still pretty much the same. Anna Zeide is a historian and Assistant Professor of Professional Practice at Oklahoma State University. She is the author of Canned: The Rise and Fall of Consumer Confidence in the American Food Industry.
81a21dce7bed78573e8016f0fe92c76d
https://www.smithsonianmag.com/history/brief-history-anti-fascism-180975152/
A Brief History of Anti-Fascism
A Brief History of Anti-Fascism Eluard Luchell McDaniels traveled across the Atlantic in 1937 to fight fascists in the Spanish Civil War, where he became known as “El Fantastico” for his prowess with a grenade. As a platoon sergeant with the Mackenzie-Papineau Battalion of the International Brigades, the 25-year-old African American from Mississippi commanded white troops and led them into battle against the forces of General Franco, men who saw him as less than human. It might seem strange for a Black man to go to such lengths for the chance to fight in a white man’s war so far from home—wasn’t there enough racism to fight in the United States?—but McDaniels was convinced that anti-fascism and anti-racism were one and the same. “I saw the invaders of Spain [were] the same people I’ve been fighting all my life,” historian Peter Carroll quotes McDaniels as saying. “I’ve seen lynching and starvation, and I know my people’s enemies.” McDaniels was not alone in seeing anti-fascism and anti-racism as intrinsically connected; the anti-fascists of today are heirs to almost a century of struggle against racism. While the methods of Antifa have become the object of much heated political discourse, the group’s ideologies, particularly its insistence on physical direct action to prevent violent oppression, are much better understood when seen in the framework of a struggle against violent discrimination and persecution that began almost a century ago. Historian Robert Paxton’s Anatomy of Fascism—one of the definitive works on the subject—lays out the motivating passions of fascism, which include “the right of the chosen group to dominate others without restraint from any kind of human or divine law.” At its heart, fascism is about elevating the needs of one group, often defined by race or ethnicity, over the rest of humanity; anti-fascists have always opposed this. Anti-fascism began where fascism began, in Italy. Arditi del Popolo—“The People’s Daring Ones”—was founded in 1921, named after the Italian army’s shock troops from World War I who famously swam across the Piave River with daggers in their teeth. They committed to fight the increasingly violent faction of blackshirts, the forces encouraged by Benito Mussolini, who was soon to become Italy’s fascist dictator. The Arditi del Popolo brought together unionists, anarchists, socialists, communists, republicans and former army officers. From the outset, anti-fascists began to build bridges where traditional political groups saw walls. Those bridges would quickly extend to the races persecuted by fascists. Once in government, Mussolini began a policy of “Italianization” that amounted to cultural genocide for the Slovenes and Croats who lived in the northeastern part of the country. Mussolini banned their languages, closed their schools and even made them change their names to sound more Italian. As a result, the Slovenes and Croats were forced to organize outside of the state to protect themselves from Italianization, and allied with anti-fascist forces in 1927. The state responded by forming a secret police, the Organizzazione per la Vigilanza e la Repressione dell’Antifascismo, the Organization for Vigilance and Repression of Anti-Fascism (OVRA), which surveilled Italian citizens, raided opposition organizations, murdered suspected anti-fascists, and even spied on and blackmailed the Catholic Church. 
Anti-fascists would face off against the OVRA for 18 years, until an anti-fascist partisan who used the alias Colonnello Valerio shot Mussolini and his mistress with a submachine gun in 1945. Similar dynamics presented themselves as fascism spread across pre-war Europe. The leftists of Germany’s Roter Frontkämpferbund (RFB) first used the famous clenched-fist salute as the symbol of their fight against intolerance; when, in 1932, they became Antifaschistische Aktion, or “antifa” for short, they fought Nazi anti-Semitism and homophobia under flags with the red-and-black logo that antifa groups wave today. That fist was first raised by German workers, but would go on to be raised by the Black Panthers, Black American sprinters Tommie Smith and John Carlos at the 1968 Olympics and Nelson Mandela, among many others. In Spain, anti-fascist tactics and solidarity were put to the test in 1936, when a military coup challenged the working- and middle-class groups organized as a broad-based popular front against fascism. The anti-fascists stood strong and became an example of the power of the people united against oppression. In the early days of the Spanish Civil War, the Republican popular militia was organized much like modern antifa groups: They voted on important decisions, allowed women to serve alongside men and stood shoulder to shoulder with political adversaries against a common enemy. Black Americans like McDaniels, still excluded from equal treatment in the U.S. military, served as officers in the brigades of Americans who arrived in Spain ready to fight against the fascists. Overall, 40,000 volunteers from Europe, Africa, the Americas and China stood shoulder to shoulder as anti-fascist comrades against Franco’s coup in Spain. In 1936 there were no black fighter pilots in the U.S., yet three black pilots—James Peck, Patrick Roosevelt, and Paul Williams—volunteered to fight the fascists in the Spanish skies. At home, segregation had prevented them from achieving their goals of air combat, but in Spain they found equality in the anti-fascist ranks. Canute Frankson, a black American volunteer who served as head mechanic of the International Garage in Albacete, summed up his reasons for fighting in a letter home: We are no longer an isolated minority group fighting hopelessly against an immense giant. Because, my dear, we have joined with, and become an active part of, a great progressive force on whose shoulders rests the responsibility of saving human civilization from the planned destruction of a small group of degenerates gone mad in their lust for power. Because if we crush Fascism here, we’ll save our people in America, and in other parts of the world from the vicious persecution, wholesale imprisonment, and slaughter which the Jewish people suffered and are suffering under Hitler’s Fascist heels. In the United Kingdom, anti-fascists became an important movement as anti-Semitism emerged as a salient force. In October 1936, Oswald Mosley and the British Union of Fascists attempted to march through Jewish neighborhoods in London. Mosley’s 3,000 fascists, and the 6,000 policemen who accompanied them, found themselves outnumbered by the anti-fascist Londoners who had turned out to stop them. Estimates of the crowd vary from 20,000 to 100,000. Local children were recruited to roll their marbles under the hooves of police horses, while Irish dockworkers, Eastern European Jews, and leftist workers stood side-by-side to block the marchers’ progress. 
They raised their fists, like German anti-fascists, and chanted “No pasaran” (“They shall not pass!”, the slogan of the Spanish militia), and they sang in Italian, German and Polish. They succeeded: The fascists did not pass, and Cable Street became a symbol of the power of a broad anti-fascist alliance in shutting down fascist hate speech on the streets. During the Second World War, anti-fascism passed into its second stage, as it moved from the streets to stand alongside those in the seats of power. Winston Churchill and other imperialists stood against fascism even as they stood for the colonialism that left Indian people to starve to support their war effort. An alliance between committed anti-fascists and temporary anti-Nazis was formed. It’s become a social media meme of sorts that those who fought in the Second World War were anti-fascists, but this claim strains against the core of anti-fascist belief. The U.S. military that defeated the Nazis alongside the Allies was segregated: black troops were relegated to second-class roles and could not serve alongside white troops in the same unit. Anti-fascism opposed the primacy of any group; anti-fascist soldiers in Spain had stood next to Black comrades as equals, but American troops in the Second World War did not. After the war, anti-fascism left the corridors of power and returned to the streets. Britain had fought against fascism, but never exorcised its homegrown hate and quickly released detained fascist sympathizers after the war. British Jewish ex-servicemen who had fought fascism on the battlefields of Europe returned home to see men like Mosley continue to deliver anti-Semitic and anti-immigrant rhetoric in public. Through new organizations they founded, they would soon infiltrate Mosley’s speeches and literally deplatform him by rushing the stage and pushing it over. The same anti-immigrant logic that sustained Mosley’s fascism in the U.K. later appeared in Germany in the 1980s, and again anti-fascists stepped up to confront hate and racism in the form of Nazi skinheads who had begun to infiltrate the punk scene. This so-called third wave of anti-fascism embraced tactics like squatting while reviving the raised fist and black-and-red logos used by their grandparents in the 1930s. The most radical and numerous squats were found in Hamburg, where diverse groups of young people occupied empty buildings as part of an urban counterculture that rejected both the Cold War and the legacy of fascism. When German football club FC St. Pauli moved its stadium nearby, the anti-racist, anti-fascist culture of the squats became the club’s guiding principle. Even as anti-immigrant sentiment returned to German politics in the 1980s and football fan culture turned racist and violent, some German football fans—most notably those of the St. Pauli club—stood up against racism. This fan culture became legendary among the global left and the club itself embraced it: Today, the St. Pauli stadium is painted with slogans such as “no football for fascists,” “football has no gender,” and “no human being is illegal.” They’ve even set up a team for refugees. The team, with its skull and crossbones logo borrowed from Hamburg’s 14th-century anti-authoritarian pirate hero Nikolaus Stoertebeker, might represent the coolest anti-fascism has ever been. I’ve seen their stickers in the filthy bathrooms of punk shows on three continents and saw that skull and crossbones flag at a Black Lives Matter rally this week. 
But today’s anti-fascism isn’t about waving flags at football matches; it’s about fighting, through direct action, racists and genocidaires wherever they can be found. Anti-fascist volunteers, drawing on the experience of their predecessors in Spain, have been quietly slipping through international cordons to northeastern Syria since 2015 to fight against ISIS and Turkish conscripts. In the Syrian region known as Rojava, just as in Republican Spain, men and women fight side by side, raise their fists for photographs and proudly display the black-and-red flag logo as they defend the Kurdish people abandoned by the world. When Italian volunteer Lorenzo Orzetti was killed by ISIS in 2019, the men and women of Rojava sang “Bella Ciao,” an anti-fascist ditty from 1920s Italy. The song grew popular in the mountains of Syria nearly 90 years later, and today there are dozens of Kurdish recordings available. Just as anti-fascism protected persecuted Slovenes and Croats, it takes up arms to defend Kurdish autonomy today. Back in Germany, the St. Pauli faithful keep up with the news from their confederates in Syria, and fans often hold up colored cards to form the flag of Rojava at games. And, of course, anti-fascism has made a resurgence in the United States. In 1988 Anti-Racist Action was formed, on the basis that anti-racism and anti-fascism are one and the same and that the ARA name might be more obvious to people in the U.S. In California, Portland, Pennsylvania, Philadelphia, New York and across the country, autonomous groups have emerged to fight the rise in hate speech, stand by LGBTQIA and BIPOC people, and combat hate crime. In Virginia, the local clergy relied on Antifa to keep people safe during the “Unite the Right” rally of 2017. Using the logo of the 1930s German antifa, the raised fist of the RFB, and the slogan No pasaran, these groups have stood in front of racists and fascists in Los Angeles, Milwaukee, and New York—just as their predecessors did at Cable Street. Even though accusations have been leveled at Antifa for turning recent protests violent, little evidence exists that those affiliated with the anti-fascist cause have been behind any violence. Anti-fascism has changed a lot since 1921. Today’s anti-fascist activists spend as much time using open-source intelligence to expose white supremacists online as they do building barricades in the street. Just as their predecessors did in Europe, anti-fascists use violence to combat violence. This has earned them a reputation as “street thugs” in some parts of the media, just as was the case at Cable Street. The Daily Mail ran the headline “Reds Attack Blackshirts, Girls Among Injured” the day after that battle, which is now largely seen as a symbol of intersectional shared identity among the London working class. When Eluard McDaniels returned home from Spain, he was barred from employment as a merchant sailor, and his colleagues were labeled “premature anti-fascists” by the FBI, even though the United States would end up fighting against the same Nazi pilots just three years later. The last U.S. volunteer from the Spanish Civil War, a white Jewish man named Delmer Berg, died in 2016 aged 100. Berg, who was pursued by the FBI and blacklisted during the McCarthy Era, served as the vice president of his county’s NAACP branch, organized with the United Farm Workers and the Mexican-American Political Association, and credited his intersectional activism as the key to his longevity. 
On the occasion of Berg’s death, Senator John McCain wrote an op-ed saluting this brave, “unreconstructed communist.” Politically, McCain and Berg would have agreed on very little, and McCain notably avoided discussing the persecution Berg and his comrades faced on their return to America, but McCain did quote a poem by John Donne—the same poem that gave Hemingway’s novel about the Spanish Civil War its title. By quoting Donne, McCain suggested that anti-fascism is a basic human impulse, and Donne’s poem captures the expansive humanitarian view that would motivate anti-fascists 300 years later: Each man’s death diminishes me, For I am involved in mankind. Therefore, send not to know For whom the bell tolls, It tolls for thee. James Stout is a historian of anti-fascism in sport and a freelance journalist. His research is partially funded by the IOC Olympic Studies Centre and the PhD Students and Early Academics Research Grant Programme.
d6a468b96ce00c959898b28a8775c60e
https://www.smithsonianmag.com/history/brief-history-cooties-180971914/
A Brief History of Cooties
A Brief History of Cooties Of all the germs kids are exposed to on the playground, there’s one they freak out about more than any other: cooties. The word first appeared during World War I as soldiers’ slang for the painful body lice that infested the trenches. It went mainstream in 1919 when a Chicago company incorporated the pest into the Cootie Game, in which a player maneuvered colored “cootie” capsules across a painted battlefield into a cage. The cooties concept has been evolving ever since. The most familiar incarnation has features of a real infectious disease even as it says a good deal about what 6-year-olds think of the opposite sex. Every little girl knows that boys have cooties, and vice versa. One catches cooties by—eww!—touching. Shrieking games of cooties tag transmit the contagion rapidly. It can be treated with an origami “cootie catcher,” but it is better to be vaccinated. This requires a friend and a retractable pen. Your friend clicks the pen onto your arm while chanting “circle, circle, dot, dot, now you have your cootie shot.” Folklore archives and internet forum threads show that regional variations of the therapeutic regimen have emerged. In Louisville, the charm is “line, line, dot, dot, operation cootie shot”; in Los Angeles, kids “pinch, pinch” in lieu of the “dot, dot”; in Hawaii, the process is known as an “uku shot.” To historians and social scientists, the cooties phenomenon isn’t just child’s play. Kids, after all, are their own “semiliterate society” with their own cultural touchstones, says Simon Bronner, a folklorist at Penn State Harrisburg who has studied children’s traditions. The purpose of something like the cootie shot, passed down from generation to generation, “must be profoundly important if all these kids are choosing to participate in it,” says Tok Thompson, an anthropologist at the University of Southern California who studies modern folklore. Play helps kids make sense of new ideas, experiences and emotions, not to mention traditional gender roles. The cootie shot itself is part teaching tool, part coping mechanism. Bronner traces the emergence of this form of cooties to the 1950s, when the polio vaccine became ubiquitous, and notes a spike in its popularity in the 1980s, during the height of the AIDS epidemic. Nowadays, cooties also reflect other concerns, particularly physical appearance; an obese child, for instance, might be said to have cooties. There’s a greater emphasis on body shaming, Bronner says. Like a real virus, cooties mutate, and they’ll likely be around for as long as children have insecurities to play out. Cooties weren’t just for kids. As early as 1921, a dice game called Cootie was a favorite at wedding showers. Women competed in teams of two to draw the very bug that had tormented many a husband-to-be during the war. One woman threw a die; the other was the artist. A six earned the team a cootie body; a five, the head; a four, one of six legs; and so on. The game was still popular among brides in 1949 when a Minnesota inventor created a 3-D version in which players built cooties from colorful plastic pieces. It became a big hit with children and is still in production today. This article is a selection from the May issue of Smithsonian magazine. Jane C. Hu is a journalist based in Seattle. Her writing has appeared in several publications, including WIRED and The Atlantic.
061b5b7bb1c6aecce692124f3aa97930
https://www.smithsonianmag.com/history/brief-history-gif-early-internet-innovation-ubiquitous-relic-180963543/
A Brief History of the GIF, From Early Internet Innovation to Ubiquitous Relic
A Brief History of the GIF, From Early Internet Innovation to Ubiquitous Relic What do Barack Obama, the sloth from Zootopia, and a bear waving its paw have in common? All were named “most popular in 2016” for that most zeitgeist-y of Internet memes: animated GIFs. Since their creation 30 years ago, the looping clips have followed a rocky path to stardom, going from ubiquitous to repudiated and back again. Whether you love them or decry their infantilizing impact on language, it’s impossible to go long without seeing them on the news, social media, or even in office Slack rooms. Thanks to the humble GIF, no emotions are too big or small to capture in animated image form. Developer Steve Wilhite and his team at tech giant CompuServe had a problem to solve: how to make a computer display an image while also saving memory. It was 1987, four years before the advent of the World Wide Web, when users who wanted to access email or transfer files did so with hourly subscriptions from companies like CompuServe. Then as now, the issue was space. How could a color image file be shared without taking up too much of the computer’s memory? Wilhite found a way to do so using a compression algorithm (more on this soon) combined with image parameters like the number of available colors (256). His new creation could be used to exchange images between computers, and he called it Graphics Interchange Format. The GIF was born. (For the record, Wilhite pronounces his creation with a soft G, using a play on the peanut butter ad as a demonstration: “Choosy developers choose GIF.” He reiterated the point when he was given a Lifetime Achievement Award at the 2013 Webby Awards. But that has hardly settled the debate, as many others insist on the hard “g” as in the word “gift” but without the “t”. Even dictionaries like the Oxford English Dictionary have unhelpfully declared both pronunciations valid.) Initially, GIFs were used almost exclusively for still images. What made the format revolutionary was a specific compression algorithm, named Lempel-Ziv-Welch for its three creators (Abraham Lempel, Jacob Ziv and Terry Welch). The way it worked was to identify repeating patterns, then simplify them, allowing for lossless compression of files—meaning none of the data is trimmed in the shortening process. As Eric Limer explains in Popular Mechanics: [LZW] let computers invent a whole new phrase like ‘blite’ pixel for combinations like ‘a blue pixel, a white pixel,’ but also combo-phrases like ‘bliteple’ for ‘blite pixel, purple pixel’ and on and on, cramming more and more information into a single new word. This approach made the GIF uniquely talented at fitting photorealistic color images with their interwoven colors into small and practical packages. Included in the file were multiple variations of the still image, which could be strung together to create a looping video, like a flipbook. The first example of this was a weather map. But when developers took to the World Wide Web in 1991, they mostly used still images. The first color picture online was even a GIF. “GIF soon became a world standard, and also played an important role in the Internet community,” writes software developer Mike Battilana. “Many developers wrote (or acquired under license) software supporting GIF without even needing to know that a company named CompuServe existed.” And therein lay one major problem: the LZW algorithm that made GIFs possible was actually under patent, owned by a company called Unisys Corp. 
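The phrase-building Limer describes is simple enough to sketch in a few lines of code. The following Python is a toy illustration of the general LZW idea only: it compresses text rather than GIF color-index streams, and it emits plain integer codes instead of the variable-width binary codes a real GIF encoder packs into the file.

```python
def lzw_compress(data: str) -> list[int]:
    """Emit a code for the longest already-seen phrase, then coin a
    new dictionary entry for that phrase plus the next symbol."""
    dictionary = {chr(i): i for i in range(256)}  # every 1-byte phrase
    next_code = 256
    phrase = ""
    codes = []
    for symbol in data:
        candidate = phrase + symbol
        if candidate in dictionary:
            phrase = candidate  # keep growing a known phrase
        else:
            codes.append(dictionary[phrase])    # longest known phrase
            dictionary[candidate] = next_code   # e.g. "blite" = blue+white
            next_code += 1
            phrase = symbol
    if phrase:
        codes.append(dictionary[phrase])
    return codes

# Repetition shrinks: 44 input characters become far fewer output codes.
print(len(lzw_compress("blue white " * 4)))
```

Decompression rebuilds the same dictionary on the fly from the codes themselves, which is why no separate codebook has to be shipped inside the file.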
And in 1995, after years of developers having a free-for-all with their GIFs, Unisys suddenly wanted to make good on its patent. It announced it would charge a small royalty (0.45 percent and 0.65 percent on different products) for software that used the algorithm, including TIFF and PDF as well as GIF. The patent wouldn’t run out until 2003 in the U.S. and 2004 everywhere else. Developers’ reactions ranged from the practical—creating a new file format named PNG (at one point called PING, for “PING Is Not GIF”) that didn’t use the LZW algorithm—to the theatrical. On the latter end of this spectrum was “Burn All GIFs” day, held on November 5, 1999, when developers gathered to delete their GIF files. “Burn All GIFs Day may be the first time in human history that anyone has ever thought it worthwhile to stage an organized political protest, even a small one, over a mathematical algorithm,” wrote The Atlantic at the time. Even though Unisys asked only large companies, rather than individual noncommercial users, to buy licenses, developers still felt the patent was a threat. GIF images were largely phased out, especially since other file formats now did a better job when it came to static pictures. But nobody else could fill one niche that GIF had cornered: animated images. And so, even as the Internet evolved beyond early HTML, the scrappy old GIF clung on for dear life. “Before, GIFs were dressing up the content,” says Jason Eppink, curator of digital media at the Museum of the Moving Image. GIFs were clip-art images and construction symbols, he explains. But now—“the GIF itself has become the destination.” Part of the reason the GIF survived even after the GIF purge, Eppink thinks, is that it fit the DIY spirit of the early Internet. It was a small file, it could be downloaded and stored on individual servers, and nothing really came along to replace its animation style: that short, continuous, soundless loop. “Like most digital media, it fills a need but it kind of also created the need,” says Kevin Zeng Hu, a Ph.D. researcher at the MIT Media Lab. “We all know how unwieldy texting can be and how much context can be lost, especially emotional context. Once you make it visual, you have a higher bandwidth to convey nuance.” Hu partnered with Travis Rich in 2014 to create GIFGIF, a project aimed at quantifying the emotions that come from certain GIFs. The site functions as a kind of giant A/B test, with users asked to identify which of two GIFs better represents an emotion. To date the project has received almost 3.2 million responses, and its creators have been impressed by the accuracy of the top GIFs for each emotion. In the years since the project began, Hu says, GIFs have become better indexed and more easily usable, thanks to platforms like Giphy. Ironically, today many of the GIFs seen on sites like Twitter and Imgur are actually video files that have been coded to behave like GIFs, simply because new video technology is more efficient than the outdated GIF storage format. “It kind of transcended the file format to become a name for this specific cultural meme,” says Hu. For Eppink, another unique aspect of GIFs is their lack of authorship and how divorced they become from their source material. Just because you’re familiar with a GIF—say, a kid at an old computer giving you a thumbs up—doesn’t mean you have any idea where that animation came from. “Most of the time when excerpts are used, they’re still the property of the thing they came from.
There’s something interesting in GIFs in that they become their own entity,” Eppink says. For now, GIFs are protected from copyright claims by the fair use doctrine (which protects copying material for limited and transformative purposes), though that protection hasn’t been tested in court. In the past, sports associations like the NFL and the NCAA’s Big 12 conference have sent claims to Twitter about accounts using GIFs of sports events, and the International Olympic Committee unsuccessfully tried to ban GIFs from the 2016 Olympics. Despite the uncertainty over the GIF’s legal future, it’s a cultural icon with staying power. GIFs have even appeared twice at the Museum of the Moving Image. In 2014 the museum hosted an installation on reaction GIFs, and this June it will open another exhibition dedicated to the animated images: a GIF elevator, its walls and ceiling covered in looping pictures, where visitors can be immersed in a single, perpetual moment. “A successful GIF is one that is shared,” Eppink wrote in an article on the history of GIFs for the Journal of Visual Culture. “Even though individuals process the pixels, communities make the GIFs.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
4c3370736513b086242fc3994d05d180
https://www.smithsonianmag.com/history/brief-history-nickel-180958941/
A Brief History of the Nickel
A Brief History of the Nickel The nickel wasn't always worth five cents. In 1865, the U.S. nickel was a three-cent coin. Before that, “nickel cents” referred to alloy pennies. It turns out that even the name “nickel” is misleading. “Actually, nickels should be called 'coppers,'” says coin expert Q. David Bowers. Today's so-called nickels are 75 percent copper. Those aren't the only surprises hidden in the history of the nickel. The story of America's five-cent coin is, strangely enough, a war story. And 150 years after it was first minted in 1866, the modest nickel serves as a window into the symbolic and practical importance of coinage itself. To understand how the nickel got its name, you have to go back to an era when precious metals reigned supreme. In the 1850s, coins of any real value were made of gold and silver. In the event of a financial crisis—or worse, the collapse of a government—precious-metal coins could always be melted down. They had intrinsic value. But by the spring of 1861, southern states had begun to secede, and Abraham Lincoln had been sworn in as president. Soon shells were falling on Fort Sumter in Charleston, South Carolina. America was in crisis, and so was its currency. “The outcome of the Civil War was uncertain,” says Bowers, an author of several books on coin history. Widespread anxiety led to an important side effect of war. “People started hoarding hard money, especially silver and gold.” Coins seemed to vanish overnight, and the U.S. Mint couldn't keep up with demand. “The United States literally did not have the resources in gold and silver to produce enough money to meet the needs of the country,” says Douglas Mudd, the director of the American Numismatic Association. “Even the cent was disappearing.” In the South, this problem was even worse. The limited supply of gold and silver was needed to purchase supplies from abroad, which meant the Confederacy relied almost exclusively on paper currency. Minting new coins might not seem like a priority in a time of war. But without coinage, the transactions of everyday life—buying bread, selling wares, sending mail—become almost impossible. One Philadelphia newspaper reported that the local economy had slowed to a crawl in 1863, noting that some storekeepers had to cut their prices “one to four cents on each transaction” or refuse to sell products outright because they were unable to get hold of money. Mudd puts the problem in more familiar terms. “It's like, all of a sudden, not being able to go to 7-Eleven because [the cashier] can't make change,” he says. “And if [they] can't make change, the economy stops.” It was in this economic vacuum that the United States tried a series of monetary experiments. In 1861, the government began paying Union soldiers with “Demand Notes”—also known as “greenbacks.” Meanwhile, stamps were declared legal tender for small purchases; a round metal case was developed to keep them intact. “It looked like a coin with a window on it,” says Mudd. For the duration of the war, the American economy puttered along with all kinds of competing currency. Even private banks and businesses were releasing their own notes and coins. Shopkeepers could give coins, stamps or bills as change. The war finally ended in 1865, but it took many months for precious metals to trickle back into circulation. “It's not until after the Civil War that coin production resumes at full capacity,” says Mudd. As the United States turned its attention to rebuilding, not all metals were scarce.
War production had expanded America's industrial capacity, and nickel was available in huge quantities. The advantage of nickel lay in what it wasn't. It wasn't scarce, which meant the government could mint millions of coins without creating new shortages. And it wasn't a precious metal, so people wouldn't hoard it. In fact, some cent coins had already been minted using nickel—and as one Pennsylvania newspaper pointed out, “the hoarding of them is unwise and injudicious.” There's no sense in hoarding a coin whose value comes from a government guarantee. Only after a bizarre 1866 controversy about paper money, however, did nickel coins finally conquer everyday life. At the time, the National Currency Bureau (later called the Bureau of Engraving and Printing) was led by a man named Spencer Clark. He was tasked with finding a suitable portrait for the five-cent note. Clark's selection was a proud-looking man with dark eyes and a thick white beard. The public was not amused. “He put his own image on there,” says Mudd. “There was a major scandal.” “Clark put his own head on the currency without any authority whatever,” declared an angry letter to the New York Times. Reporting by the Times depicted Clark's bearded portrait as an assault on the dignity of American money. Another letter-writer chimed in: “It shows the form of impudence in a way seldom attempted before. It is not the first time, however, that men have made a strike for fame, and only achieved notoriety.” While legislators were making speeches in Congress denouncing Clark's portrait, an industrialist named Joseph Wharton was busy prodding them to find an alternative to paper money. In the early years of the war, Wharton had bought up nickel mines in New Jersey and Pennsylvania, so his suggestion should have come as no surprise. He wanted coins to be made out of nickel. Two months later, five-cent notes were quietly retired. And as Philadelphia's Daily Evening Bulletin reported in May of 1866, a new coin was to immediately take its place. “The President [Andrew Johnson] has approved a bill to authorize the coinage of five cent pieces, composed of nickel and copper,” said the article. “There are to be no more issues of fractional notes of a less denomination than ten cents.” The new coin was decorated with a shield, the words “In God We Trust,” and a large “5,” surrounded by a star and ray design. That year, the government minted a whopping 15 million five-cent nickels—more than 100 times the number of silver half-dimes minted the year before. As far as the future of the nickel was concerned, the timing was perfect. The postwar economy began to gather steam again. “The supply was there, and the demand was there,” says Mudd. “People wanted coins.” The nickel caught on for a few reasons. First of all, after years of coin shortages, nickels flooded the economy. Nearly 30 million were minted in 1867 and 1868. “The nickel was the coin from 1866 to 1876,” says Bowers. Even after that, as dimes and quarters rose in prominence, nickels were the coin of convenience. Bottles of Coca-Cola, which entered the marketplace in 1886, cost a nickel for 73 years. The shield nickel was produced until 1883, when it was replaced, due to manufacturing issues, by the “Liberty Head” nickel. The decades that followed saw a succession of new designs, starting in 1913 with the Buffalo nickel and followed in 1938 by the initial Jefferson nickel.
(Ironically, during World War II, nickel was so essential for war production that nickels were produced without any nickel.) The most recent update, in 2006, revised Jefferson's image from a profile to a frontal portrait. In the 20th century, one other shift cemented the nickel as an indispensable coin of the realm: the rise of coin-operated machines. Nickels were the ideal denomination for vending machines, jukeboxes, and slot machines. It also cost five cents to attend a “nickelodeon”—that is, a nickel theater. (Odeon comes from the Greek word for theater.) “Nickels went into the mainstream,” says Bowers. Nickels have come full circle since their roots in the gold and silver shortages of the Civil War. One hundred and fifty years ago, coins made of nickel seemed convenient because they were made of cheap metals. These days, nickel and copper prices are high, and our beloved five-cent coin costs around 8 cents to produce. Maybe it's time to bring back the five-cent note. Daniel A. Gross is a freelance journalist and public radio producer based in Boston.
f09071114fc6f52868aac52504f90386
https://www.smithsonianmag.com/history/brief-history-surveillance-america-180968399/
A Brief History of Surveillance in America
A Brief History of Surveillance in America Brian Hochman assumes this conversation is being recorded. It’s a professional hazard for the Georgetown associate professor of English and American studies. For the last several years, Hochman has been studying electronic surveillance—both the technological developments that have made eavesdropping possible and the cultural and political realities that have made it a part of American life for more than 150 years. “Americans have come to terms with the inconvenient truth that there is no such thing as electronic communication without electronic eavesdropping,” says Hochman, a 2017-2018 National Endowment for the Humanities Public Scholar, who is currently writing a book on the subject. With wiretapping in the headlines and “smart” speakers in millions of homes, we asked Hochman to take us back to the early days of eavesdropping and to consider the future of “dataveillance.”
How far back do we have to go to find the origins of wiretapping?
It starts long before the telephone. The earliest statute prohibiting wiretapping was written in California in 1862, just after the Pacific Telegraph Company reached the West Coast, and the first person convicted was a stockbroker named D.C. Williams in 1864. His scheme was ingenious: He listened in on corporate telegraph lines and sold the information he overheard to stock traders.
Who’s been doing the eavesdropping?
Until the 1920s, wiretapping was most often used by private detectives and corporations. It wasn’t until Prohibition that it became a common law enforcement tool, but even after a 1928 Supreme Court ruling narrowly affirmed the constitutionality of police wiretapping, its legality—and its morality—remained a point of fierce contention. Then, the 1930s brought revelations that wiretapping was a widespread and viciously effective tool for corporate management to root out union activity. The La Follette Civil Liberties Committee in the United States Senate, for instance, found all sorts of wiretap abuses on the part of corporations. Hiring private detectives to spy on labor unions was one of the classic dirty tricks of the period.
When did the general public become concerned about issues of wiretapping?
It’s only in the 1920s that ordinary Americans start to take notice of wiretapping, and it’s not really until the 1950s that it’s seen as a national problem. Even then, it’s mostly the issue of private wiretapping that concerns people. Wiretapping for hire was extremely common in certain locations, most famously in New York. It was legal, for instance, under murky one-party consent laws to hire an electronic surveillance specialist—known as a “private ear”—to tap your wires to see if your wife is carrying on with another man. Needless to say, the American public was worried about this army of unofficial actors who had the ability and the know-how to tap into the rapidly expanding telephone network. Feelings were mixed about “official” wiretapping. By 1965, the normative political position in the United States was that wiretapping for national security was a necessary evil, whereas wiretapping in the service of the enforcement of criminal law—in, say, tax evasion cases or even in Mafia prosecutions, which were a big priority among American law enforcement starting in the 1960s—was outrageous and an abuse of power. Today, it’s the opposite. Most people are worried about wiretapping by the government.
That started with Watergate, when the public saw abuses of wiretapping by the executive branch, and it has spiked again with the Edward Snowden revelations about the National Security Agency. But it’s important to realize that today there are almost twice as many warranted wiretaps carried out for criminal investigations as for national security ones. Since wiretapping in criminal investigations disproportionately targets African-Americans and Latinos as part of the “war on drugs,” it isn’t just a civil liberties issue; it’s a civil rights issue.
What does the 150-plus-year history of wiretapping reveal about the issue today?
There is something categorically different about electronic surveillance in our contemporary moment: the extent to which it operates on a mass scale. Wiretapping and electronic eavesdropping were highly individualized up until the 1980s. We were tapping individual telephones and listening to individual conversations. Now, as a result of the rise of “dataveillance” in particular, we’re talking about a scale of surveillance that scarcely seems fathomable from the perspective of the 1960s, 1970s, or even the 1980s. Dataveillance is the tracking of metadata. The NSA does listen to people’s conversations, which is what we traditionally think “wiretapping” is, but far more often the NSA tracks the data of those conversations. What’s important isn’t necessarily what you said on the phone but who you called, when you called, where your phone is, the metadata of your financial transactions—that sort of stuff. They triangulate a million different data points and they can come to a very clear understanding of what has happened. But one area of continuity with even the earliest days of wiretapping is the extent to which telecommunications industries are complicit in the rise of a surveillance state and the extent to which surveillance data flows between the telecommunication infrastructure and the infrastructure of American law enforcement. The easiest way for law enforcement to tap wires in the 1920s in the service of the war on alcohol wasn’t to actually go and physically tap a wire but to listen in through the Bell System central exchange. Bell publicly resisted complicity in that arrangement, but that’s what happened. It’s the same today.
Yet people are willing to let companies eavesdrop on them.
Those smart speakers? They are essentially wiretaps. They are constantly listening. It’s a new type of corporate surveillance: If they listen to you, they can get you what you want, when you want. People like that. But where else will that data go?
What will happen next?
Historians are not in the business of prognostication, but the one thing that I can say with some certainty is that electronic surveillance and dataveillance are going to scale. They will be more global and more instantaneous. I can say with much more certainty that public attention to these issues will wax and wane. This is one of the things that is so striking about the history of wiretapping in the United States: It has never been a secret, but it’s only every 10 to 15 years that there is a major public scandal surrounding it. There are these brief moments of outrage and then there are these long moments of complacency, like now, and that is one thing that has enabled surveillance to persist in the way that it does.
This article is a selection from the April issue of Smithsonian magazine. April White is a former senior editor for Smithsonian magazine.
9cab3e2dad5a07afaaaf2fe17735c763
https://www.smithsonianmag.com/history/brutal-genocide-colonial-africa-finally-gets-its-deserved-recognition-180957073/
A Brutal Genocide in Colonial Africa Finally Gets its Deserved Recognition
A Brutal Genocide in Colonial Africa Finally Gets its Deserved Recognition As a teenager in the 1960s, Israel Kaunatjike joined the fight against apartheid in his native Namibia. He couldn't have known that his activism would take him across the globe, to Berlin—the very place where his homeland's problems started. Back then, Europeans called Kaunatjike’s home South-West Africa—and it was European names that carried the most weight; tribal names, or even the name Namibia, had no place in the official taxonomy. Black and white people shared a country, yet they weren't allowed to live in the same neighborhoods or patronize the same businesses. That, says Kaunatjike, was verboten. A few decades after German immigrants staked their claim over South-West Africa in the late 19th century, the region came under the administration of the South African government, thanks to a League of Nations mandate. This meant that Kaunatjike's homeland was controlled by descendants of Dutch and British colonists—white rulers who, in 1948, made apartheid the law of the land. Its shadow stretched from the Indian Ocean to the Atlantic, covering an area larger than Britain, France, and Germany combined. “Our fight was against the regime of South Africa,” says Kaunatjike, now a 68-year-old resident of Berlin. “We were labeled terrorists.” During the 1960s, hundreds of anti-apartheid protesters were killed, and thousands more were thrown in jail. As the South African government tightened its fist, many activists decided to flee. “I left Namibia illegally in 1964,” says Kaunatjike. “I couldn't go back.” He was just 17 years old. ********** Kaunatjike is sitting in his living room in a quiet corner of Berlin, the city where he's spent more than half his life. He has a light beard and wears glasses that make him look studious. Since his days fighting apartheid, his hair has turned white. “I feel very at home in Berlin,” he says. Which is a bit ironic, when you consider that in the 1880s, just a few miles from Kaunatjike's apartment, the German Kaiser ordered the invasion of South-West Africa. This makes his journey a strange sort of homecoming. The battle that Kaunatjike fought as a teen and arguably still fights today, against the cycle of oppression that culminated in apartheid, began with a brutal regime established by the German empire. It ought to be recognized as such—and with help from Kaunatjike, it might. ********** Germans first reached the arid shores of southwestern Africa in the mid-1800s. Travelers had been stopping along the coast for centuries, but this was the start of an unprecedented wave of European intervention in Africa. Today we know it as the Scramble for Africa. In 1884, German chancellor Otto von Bismarck convened a meeting of European powers known as the Berlin Conference. Though the conference determined the future of an entire continent, not a single black African was invited to participate. Bismarck declared South-West Africa a German colony suitable not only for trade but for European settlement. Belgium's King Leopold, meanwhile, seized the Congo, and France claimed control of West Africa. The German flag soon became a beacon for thousands of colonists in southern Africa—and a symbol of fear for local tribes, who had lived there for millennia. Missionaries were followed by merchants, who were followed by soldiers. The settlers asserted their control by seizing watering holes, which were crucial in the parched desert.
As colonists trickled inland, local wealth—in the form of minerals, cattle, and agriculture—trickled out. Indigenous people didn't accept all this willingly. Some German merchants did trade peacefully with locals. But as with the Belgians in the Congo and the British in Australia, official German policy was to seize territory that Europeans considered empty, when it most definitely was not. There were 13 tribes living in Namibia, of which two of the most powerful were the Nama and the Herero. (Kaunatjike is Herero.) Germans were tolerated partly because they seemed willing to involve themselves as intermediaries between warring local tribes. But in practice, their treaties were dubious, and when strife among locals served German interests, they stood idly by. The German colonial governor at the turn of the 20th century, Theodor Leutwein, was pleased as local leadership began to splinter. According to Dutch historian Jan-Bart Gewald, for instance, Leutwein gladly offered military support to controversial chiefs, because violence and land seizure among Africans worked to his advantage. These are all tactics familiar to students of United States history, where European colonists decimated and dispossessed indigenous populations. ********** When Kaunatjike was a child, he heard only fragments of this history. His Namibian schoolteachers taught him that when the Germans first came to southern Africa, they built bridges and wells. There were faint echoes of a more sinister story. A few relatives had fought the Germans, for example, to try to protect the Herero tribe. His Herero tribe. Kaunatjike's roots are more complicated than that, however. Some of his relatives had been on the other side—including his own grandfathers. He never met either of them, because they were both German colonists. “Today, I know that my grandfather was named Otto Mueller,” says Kaunatjike. “I know where he's buried in Namibia.” During apartheid, he explains, blacks were forcibly displaced to poorer neighborhoods, and friendships with whites were impossible. Apartheid translates to “apartness” in Afrikaans. But many African women worked in German households. “Germans of course had relationships in secret with African women,” says Kaunatjike. “Some were raped.” He isn't sure what happened to his own grandmothers. After arriving in Germany, Kaunatjike started to read about the history of South-West Africa. It was a deeply personal story for him. “I was recognized as a political refugee, and as a Herero,” he says. He found that many Germans didn't know their own country's colonial past. But a handful of historians had uncovered a horrifying story. Some saw Germany's behavior in South-West Africa as a precursor of German actions in the Holocaust. The boldest among them argued that South-West Africa was the site of the first genocide of the 20th century. “Our understanding of what Nazism was and where its underlying ideas and philosophies came from,” write David Olusoga and Casper W. Erichsen in their book The Kaiser's Holocaust, “is perhaps incomplete unless we explore what happened in Africa under Kaiser Wilhelm II.” Kaunatjike is a calm man, but there's a controlled anger in his voice as he explains. While German settlers forced indigenous tribes farther into the interior of South-West Africa, German researchers treated Africans as mere test subjects. Papers published in German medical journals used skull measurements to justify calling Africans Untermenschen—subhumans. “Skeletons were brought here,” says Kaunatjike.
“Graves were robbed.” If these tactics sound chillingly familiar, that's because they were also used in Nazi Germany. The connections don't end there. One scientist who studied race in Namibia was a professor of Josef Mengele's—the infamous “Angel of Death” who conducted experiments on Jews in Auschwitz. Heinrich Goering, the father of Hitler's right-hand man, was colonial governor of German South-West Africa. The relationship between Germany's colonial history and its Nazi history is still a matter of debate. (For example, the historians Isabel Hull and Birthe Kundrus have questioned the term genocide and the links between Nazism and mass violence in Africa.) But Kaunatjike believes that past is prologue, and that Germany's actions in South-West Africa can't be disentangled from its actions during World War II. “What they did in Namibia, they did with Jews,” says Kaunatjike. “It's the same, parallel history.” ********** For the tribes in South-West Africa, everything changed in 1904. Germany's colonial regime already had an uneasy relationship with local tribes. Some German arrivals depended on locals who raised cattle and sold them land. They even enacted a rule that protected Herero land holdings. But the ruling was controversial: many German farmers felt that South-West Africa was theirs for the taking. Disputes with local tribes escalated into violence. In 1903, after a tribal disagreement over the price of a goat, German troops intervened and shot a Nama chief in an ensuing scuffle. In retaliation, Nama tribesmen shot three German soldiers. Meanwhile, armed colonists were demanding that the rule protecting Herero land holdings be overturned, wanting to force the Herero onto reservations. Soon after, in early 1904, the Germans opened aggressive negotiations that aimed to drastically shrink Herero territory, but the chiefs wouldn't sign. They refused to be herded into a small patch of unfamiliar territory that was badly suited for grazing. Both sides built up their military forces. According to Olusoga and Erichsen’s book, in January of that year, two settlers claimed to have seen Herero preparing for an attack—and colonial leaders sent a telegram to Berlin announcing an uprising, though no fighting had broken out. It isn't clear who fired the first shots. But German soldiers and armed settlers were initially outnumbered. The Herero attacked a German settlement, destroying homes and railroad tracks, and eventually killing several farmers. When Berlin received word of the collapse of talks—and the death of white German subjects—Kaiser Wilhelm II sent not only new orders but a new leader to South-West Africa. Lieutenant General Lothar von Trotha took over as colonial governor, and with his arrival, the rhetoric of forceful negotiations gave way to the rhetoric of racial extermination. Von Trotha issued an infamous order called the Vernichtungsbefehl—an extermination order. “The Herero are no longer German subjects,” read von Trotha's order. “The Herero people will have to leave the country. If the people refuse I will force them with cannons to do so. Within the German boundaries, every Herero, with or without firearms, with or without cattle, will be shot. I won’t accommodate women and children anymore. I shall drive them back to their people or I shall give the order to shoot at them.” German soldiers surrounded Herero villages. Thousands of men and women were taken from their homes and shot.
Those who escaped fled into the desert—and German forces guarded its borders, trapping survivors in a wasteland without food or water. They poisoned wells to make the inhuman conditions even worse—tactics that were already considered war crimes under the Hague Convention, which was first agreed to in 1899. (German soldiers would use the same strategy a decade later, when they poisoned wells in France during World War I.) In the course of just a few years, 80 percent of the Herero tribe died, and many survivors were imprisoned in forced labor camps. After a rebellion of Nama fighters, these same tactics were used against Nama men, women, and children. In a colony where indigenous people vastly outnumbered the thousands of German settlers, the numbers are staggering: about 65,000 Herero and 10,000 Nama were murdered. Images from the period make it difficult not to think of the Holocaust. The survivors’ chests and cheeks are hollowed out from the slow process of starvation. Their ribs and shoulders jut through their skin. These are the faces of people who suffered German rule and barely survived. This is a history that Kaunatjike inherited. ********** German colonial rule ended a century ago, when Imperial Germany lost World War I. But only after Namibia gained independence from South Africa in 1990 did the German government really begin to acknowledge the systematic atrocity that had happened there. Although historians used the word genocide starting in the 1970s, Germany officially refused to use the term. Progress has been slow. Exactly a century after the killings began, in 2004, the German development minister declared that her country was guilty of brutality in South-West Africa. But according to one of Kaunatjike's fellow activists, Norbert Roeschert, the German government avoided formal responsibility. In a striking contrast with the German attitude toward the Holocaust, which some schoolteachers start to cover in the 3rd grade, the government used a technicality to avoid formally apologizing for genocide in South-West Africa. “Their answer was the same over the years, just with little changes,” says Roeschert, who works for the Berlin-based nonprofit AfrikAvenir. “Saying that the Genocide Convention was put in place in 1948, and cannot be applied retroactively.” For activists and historians, Germany’s evasive argument—that genocide wasn’t yet an international crime in the early 1900s—was maddening. Roeschert believes the government avoided the topic on pragmatic grounds, because historically, declarations of genocide are closely followed by demands for reparations. This has been the case with the Holocaust, the Armenian Genocide, and the Rwandan Genocide. Kaunatjike is a witness and an heir to Namibia's history, but his country's story has been doubly neglected. First, historical accounts of apartheid tend to place overwhelming emphasis on South Africa. Second, historical accounts of genocide focus so intently on the Holocaust that it's easy to forget that colonial history preceded and perhaps foreshadowed the events of World War II. This might finally be changing, however. Intense focus on the centennial of the Armenian Genocide also drew attention to brutality in European colonies. A decade of activism helped change the conversation in Germany, too. Protesters in Germany had some success pressuring universities to send Herero human remains back to Namibia; one by one, German politicians began talking openly about genocide. Perhaps the greatest breakthrough came this summer.
In July, the president of the German parliament, Norbert Lammert, in an article for the newspaper Die Zeit, described the killing of Herero and Nama as Voelkermord. Literally, this translates to “the murder of a people”—genocide. Lammert called it a “forgotten chapter” in history that Germans have a moral responsibility to remember. “We waited a long time for this,” says Kaunatjike. “And that from the mouth of the president of the Bundestag. That was sensational for us.” “And then we thought—now it really begins. It will go further,” Kaunatjike says. The next step is an official apology from Germany—and then a dialogue between Namibia, Germany, and Herero representatives. Germany has so far balked at demands for reparations, but activists will no doubt make the case. They want schoolchildren to know this story, not only in Germany but also in Namibia. For Kaunatjike, there are personal milestones to match the political ones. 2015 marks 25 years of Namibian independence. In November, Kaunatjike plans to visit his birthplace. “I want to go to my old village, where I grew up,” he says. He'll visit an older generation of Namibians who remember a time before apartheid. But he also plans to visit his grandfather's grave. He never met any of his German family, and he often wonders what role they played in the oppression of Namibians. When Kaunatjike's journey started half a century ago, the two lines of his family were kept strictly separate. As time went on, however, his roots grew tangled. Today he has German roots in Namibia and Namibian roots in Germany. He likes it that way. Kaunatjike sometimes wishes he spent less time on campaigns and interviews, so he'd have more time to spend with his children. But they're also the reason he's still an activist. “My children have to know my story,” he says. He has grandchildren now, too. Their mother tongue is German. And unlike Kaunatjike himself, they know what kind of a man their grandfather is. Daniel A. Gross is a freelance journalist and public radio producer based in Boston.
35ea6a82d6c7a904a6ec2a2ef64b5bac
https://www.smithsonianmag.com/history/buried-ash-vesuvius-scrolls-are-being-read-new-xray-technique-180969358/
Buried by the Ash of Vesuvius, These Scrolls Are Being Read for the First Time in Millennia
Buried by the Ash of Vesuvius, These Scrolls Are Being Read for the First Time in Millennia It’s July 12, 2017, and Jens Dopke walks into a windowless room in Oxfordshire, England, all of his attention trained on a small, white frame that he carries with both hands. The space, which looks like a futuristic engine room, is crowded with sleek metal tables, switches and platforms topped with tubes and boxes. A tangle of pipes and wires covers the walls and floor like vines. In the middle of the room, Dopke, a physicist, eases the frame into a holder mounted on a metal turntable, a red laser playing on the back of his hand. Then he uses his cellphone to call his colleague Michael Drakopoulos, who is sitting in a control room a few yards away. “Give it another half a millimeter,” Dopke says. Working together, they adjust the turntable so that the laser aligns perfectly with a dark, charred speck at the center of the frame. Dozens of similar rooms, or “hutches,” are arrayed around this huge, doughnut-shaped building, a type of particle accelerator called a synchrotron. It propels electrons to near light speed around its 500-meter-long ring, bending them with magnets so they emit light. The resulting radiation is focused into intense beams, in this case high-energy X-rays, which travel through each hutch. That red laser shows the path the beam will take. A thick lead shutter, attached to the wall, is all that stands between Dopke and a blast of photons ten billion times brighter than the Sun. The facility, called Diamond Light Source, is one of the most powerful and sophisticated X-ray facilities in the world, used to probe everything from viruses to jet engines. On this summer afternoon, though, its epic beam will focus on a tiny crumb of papyrus that has already survived one of the most destructive forces on the planet—and 2,000 years of history. It comes from a scroll found in Herculaneum, an ancient Roman resort on the Bay of Naples, Italy, that was buried by the eruption of Mount Vesuvius in A.D. 79. In the 18th century, workmen employed by King Charles III of Spain, then in charge of much of southern Italy, discovered the remains of a magnificent villa, thought to have belonged to Lucius Calpurnius Piso Caesoninus (known as Piso), a wealthy statesman and the father-in-law of Julius Caesar. The luxurious residence had elaborate gardens surrounded by colonnaded walkways and was filled with beautiful mosaics, frescoes and sculptures. And, in what was to become one of the most frustrating archaeological discoveries ever, the workmen also found approximately 2,000 papyrus scrolls. The scrolls represent the only intact library known from the classical world, an unprecedented cache of ancient knowledge. Most classical texts we know today were copied, and were therefore filtered and distorted, by scribes over centuries, but these works came straight from the hands of the Greek and Roman scholars themselves. Yet the tremendous volcanic heat and gases spewed by Vesuvius carbonized the scrolls, turning them black and hard like lumps of coal. Over the years, various attempts to open some of them created a mess of fragile flakes that yielded only brief snippets of text. Hundreds of the papyri were therefore left unopened, with no realistic prospect that their contents would ever be revealed. And it probably would have remained that way except for an American computer scientist named Brent Seales, director of the Center for Visualization & Virtual Environments at the University of Kentucky. 
Seales is in the control room now, watching intently: frowning, hands in pockets, legs wide. The papyrus scrap in the white frame, held between two layers of transparent orange film, is just three millimeters across, and sports one barely visible letter: an old-fashioned Greek character called a lunate sigma, which looks like a lowercase “c.” Next to the turntable, shielded inside a tungsten tube, is a high-resolution X-ray detector, called HEXITEC, that has taken engineers ten years to develop. Seales believes that it will pick up the desperately faint signal he’s looking for and, in doing so, “read” the tiny Greek letter. “When I started thinking about this, this technology didn’t exist,” he says. “I don’t think there’s another detector in the world right now that could do this kind of measurement.” If it works, imaging the single letter on this charred crumb could help to unlock the secrets of the entire library. A wailing alarm sounds as Dopke exits the hutch before Drakopoulos swings shut the 1,500-pound, lead-lined door. Back in the control room, computer screens show a live feed of the papyrus from multiple angles as Drakopoulos clicks his mouse to raise the shutter and flood the hutch with radiation. Sitting next to him, an engineer prepares to capture data from the detector. “Ready?” he asks. “I’m going to press Play.” ********** Seales, who is 54, has wide-set eyes beneath a prominent brow, and an air of sincere and abiding optimism. He’s an unlikely pioneer in papyrus studies. Brought up near Buffalo, New York, he has no training in the classics. While European curators and textual scholars yearn to discover lost works of classical literature in the Herculaneum scrolls, Seales, an evangelical Christian, dreams of finding letters written by the apostle Paul, who was said to have traveled around Naples in the years before Vesuvius erupted. Seales came of age in the 1970s and ’80s—the era of early video games, when big-dreaming Californians were building computers in their garages—and he was a techie from a young age. With no money for college, but with a brain for complex mathematics and music (he played violin at his local church), Seales won a double scholarship from the University of Southwestern Louisiana to study computer science and music. Later, while earning his doctorate at the University of Wisconsin, he became fascinated with “computer vision” and began writing algorithms to convert two-dimensional photographs into 3-D models—a technique that later enabled vehicles such as Mars rovers to navigate terrain on their own. Seales went to work at the University of Kentucky in 1991, and when a colleague took him along to the British Library to photograph fragile manuscripts, Seales, captivated by the idea of seeing the unseeable, found the challenge thrilling. The British Library project was part of a “digital renaissance” in which millions of books and hundreds of thousands of manuscripts were photographed for posterity and stored online. Seales helped make a digital version of the only surviving copy of the Old English epic poem Beowulf, using ultraviolet light to enhance the surviving text. But working with the warped, cockled pages made him realize the inadequacy of two-dimensional photographs, in which words can be distorted or hidden in creases and folds.
So in 2000, he created three-dimensional computer models of the pages of a damaged manuscript, Otho B.x (an 11th-century collection of saints’ lives), then developed an algorithm to stretch them, producing an artificial “flat” version that didn’t exist in reality. When that worked, he wondered if he could go even further, and use digital imaging not just to flatten crinkled pages but to “virtually unwrap” unopened scrolls—and reveal texts that hadn’t been read since antiquity. “I realized that no one else was doing this,” he says. He began to experiment with a medical-grade computed tomography (or CT) scanner, which uses X-rays to create a three-dimensional image of an object’s internal structure. First, he tried imaging the paint on a modern rolled-up canvas. Then he scanned his first authentic object—a 15th-century bookbinding thought to contain a fragment of Ecclesiastes hidden inside. It worked. Buoyed by his success, Seales imagined reading fragments of the Dead Sea Scrolls, which include the oldest biblical writings ever found, dating to as far back as the third century B.C., sections of which remain unopened today. Then, in 2005, a classicist colleague took him to Naples, where many of the excavated Herculaneum scrolls are displayed at the National Library, a few steps from a window with a view across the bay to Vesuvius itself. Seared by gases at hundreds of degrees centigrade and superheated volcanic materials that in time hardened into 60 feet of rock, the distorted, crumbling rolls were believed by most scholars to be the very definition of a lost cause. For Seales, viewing them was an “almost otherworldly” experience, he says. “I realized that there were many dozens, probably hundreds, of these intact scrolls, and nobody had the first idea about what the text might be. We were looking at manuscripts that represent the biggest mysteries that I can imagine.” ********** He isn’t the first to try to solve these mysteries. In 1752, when Charles III’s workmen found the carbonized lumps inside what’s now known as the Villa dei Papiri, they assumed they were pieces of coal and burned them or threw them in the sea. But once they were identified as scrolls, Camillo Paderni, an artist in charge of the recovered antiquities, set about opening the remaining ones. His method involved slicing the rolls in half, copying any visible text, then scraping away each layer in turn to reveal what was beneath. Hundreds of rolls were transcribed that way—and destroyed in the process. In 1754, a Vatican priest and conservator named Antonio Piaggio dreamed up a new scheme: He glued goldbeater’s skin (a calf’s extremely thin yet tough intestinal membrane) to a scroll’s surface, then used a contraption involving weights on strings to ease it open. Artists watched this excruciatingly slow process and copied any exposed writing in pencil sketches known as disegni. Many of the flaky outer layers of the scrolls were removed before the inner portion could be unwound, and the papyrus often tore off in narrow strips, leaving layers stuck together. Hundreds of scrolls were pulled apart using Piaggio’s machine, but they revealed only limited text. Scholars searching the transcribed fragments for lost works of literature have largely been disappointed. A few pieces of Latin works were discovered, including parts of the Annales, by Quintus Ennius, a second-century B.C. epic poem about the early history of Rome, and Carmen de bello Actiaco, which tells of the final hours of Antony and Cleopatra. 
The vast majority of the opened scrolls contained Greek philosophical texts, relating to the ideas of Epicurus, an Athenian philosopher in the late fourth and early third centuries B.C., who believed that everything in nature is made up of atoms too small to see. Some are by Epicurus himself, such as a piece of On Nature, a huge work that was previously known but lost. But most are by Philodemus, an Epicurean employed by Piso in the first century B.C., and cover Epicurus’ views on ethics, poetry and music. None of the Herculaneum scrolls has been opened since the 19th century, and scholars have instead focused on squeezing information out of the already-revealed texts. A step forward came in the 1980s, when Dirk Obbink of Oxford University and Daniel Delattre of France’s National Center for Scientific Research independently worked out how to reassemble fragments dissected under Paderni. In the 1990s, Brigham Young University researchers photographed the surviving opened papyri using multispectral imaging, which deploys a range of wavelengths of light to illuminate the text. Infrared light, in particular, increased the contrast between the black ink and dark background. That was a “huge breakthrough,” says Obbink. “It enabled us to read vastly more of the unrolled rolls.” The new images triggered a wave of scholarship into Epicurean philosophy, which had been poorly understood compared with the rival ideas of Plato, Aristotle or the Stoics. But the texts were still incomplete. The beginnings of all the manuscripts remain missing. And the prose is often scrambled, because letters and words from different layers of a scroll wound up next to one another in two-dimensional renderings. “What we’d really like to do,” says Obbink, “is to read a text from beginning to end.” That was thought impossible, until Seales saw the scrolls in Naples and realized that his research had been leading to exactly this grand challenge. “I thought, I’m a year away,” Seales says. “All I have to do is get access to the scrolls, and we can solve this.” That was 13 years ago. ********** Seales vastly underestimated, among other things, the difficulty of getting permission even to study the scrolls. Conservators are understandably reluctant to hand out these terribly fragile objects, and the library in Naples refused Seales’ requests to scan one. But a handful of Herculaneum papyri ended up in England and France, as gifts from Ferdinand, son of Charles III and King of Naples and Sicily. Seales collaborated with Delattre and the Institut de France, which has six scrolls in its possession. Two of the scrolls are in hundreds of pieces after past attempts to open them, and Seales eventually received permission to study three small fragments. The first problem he hoped to solve was how to detect ink hidden inside rolled-up scrolls. From the late third century A.D. onward, ink tended to include iron, which is dense and easy to spot in X-ray images. But the papyri found at Herculaneum, created before A.D. 79, were written with ink made primarily of charcoal mixed with water, which is extremely difficult to distinguish from the carbonized papyrus it sits on. At his lab in Kentucky, Seales subjected the papyrus scraps to a battery of noninvasive tests. He looked for trace elements in the ink—anything that might show up in CT—and discovered tiny amounts of lead, perhaps contamination from a lead inkwell or water pipe. 
It was enough for the Institut de France to give him access to two intact papyri: blackened sausage-shaped artifacts that Seales nicknamed “Banana Boy” and “Fat Bastard.” Seales arranged for a 600-pound high-resolution CT scanner to be sent by truck from Belgium, and he made intricately detailed scans of the scrolls. But after months of analyzing the data, Seales was disheartened to find that the ink inside the scrolls, despite the traces of lead, was invisible. What was worse, the scans showed the layers inside the scrolls to be so carbonized that in many places there was no detectable separation between them. “It was just too complicated for our algorithms,” Seales admits. He played me a video of the CT scan data, showing one of the scrolls in cross-section. The whorls of papyrus glowed white against a dark background, like closely wound strands of silk. “Just take a look at that,” said Seales. “This is when we knew we were doomed for the present time.” What makes virtual unwrapping such a complex challenge is that, even if you imaged the inside of a rolled-up scroll written in ink that glowed brightly in scans, you would still only see a dizzying mess of tightly packed letters floating in space, like a three-dimensional jigsaw puzzle—but without a final picture to use as a guide. To decipher that jumble of letters, Seales’ key innovation was software that locates and models the surface layer within a wound-up scroll, analyzing each point in as many as 12,000 cross-sections. Then he looks for density changes that correspond to the ink and applies filters or other techniques to increase the contrast of the letters as much as possible. The final step is to figuratively “unroll” the image for reading (a toy version of this step is sketched below). Seales spent 2012 and 2013 as a visiting scientist at the Google Cultural Institute in Paris, amping up his algorithms to cope with the complex structures the CT scans had revealed. He got the chance to try his new approach soon afterward, when Pnina Shor, at the Israel Antiquities Authority, or IAA, in Jerusalem, contacted him about a carbonized roll of parchment found in the ancient town of Ein Gedi, on the western shore of the Dead Sea. The scroll was excavated from the remains of a synagogue, which was destroyed by fire in the sixth century A.D. The charred, cigar-shaped lump was far too fragile to open, but Israeli researchers had recently CT-scanned it. Would Seales take a look at the data? Shor handed over a hard drive, and Seales and his colleagues went to work. In the meantime, Seales was chasing a new idea for reading carbon-based ink: X-ray phase-contrast tomography, a highly sensitive form of imaging that can detect subtle density changes in a material—the kind that might result from applying ink to papyrus—by measuring the changing intensity of the beam as it passes through an object. Only a large particle accelerator, though, can produce such a beam. One of the nearest was Synchrotron Soleil, outside Paris. Seales’ request for “beam time” there was rejected, but he and Delattre were subsequently approached by an Italian physicist named Vito Mocella, who had close ties to another synchrotron in Grenoble, in southeastern France.
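The unrolling step, at least, is simple enough to sketch in Python. The toy code below assumes the genuinely hard problem—tracing the spiral of papyrus surface through each CT cross-section—has already been solved, which is precisely what Seales’ software does; the data layout and function names here are invented purely for illustration:

import numpy as np

def unroll_slice(cross_section: np.ndarray, surface_points) -> np.ndarray:
    # Read the scan density at each traced point, in order along the
    # spiral of papyrus; the result is one row of the flattened scroll.
    return np.array([cross_section[r, c] for r, c in surface_points])

def unroll_volume(slices, traced_surfaces) -> np.ndarray:
    # Stack the unrolled rows from consecutive cross-sections into a
    # flat image. Ink, if it registers in the scan at all, shows up
    # wherever density departs from that of bare papyrus.
    return np.stack([unroll_slice(s, pts)
                     for s, pts in zip(slices, traced_surfaces)])

In this simplified picture, every traced surface must contain the same number of points for the rows to stack; real software must also cope with torn, merged and drifting layers rather than sampling single voxels along a tidy curve.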
Seales provided custom-designed cases for the scrolls, built using data from his CT scans, but his schedule didn’t allow him to travel. So in December 2013, Delattre took Banana Boy and another scroll to Grenoble without him. Seales waited eagerly for the promised data, but the files did not arrive. Then, in January 2015, Mocella’s group published the results without him. It was, Seales says, an “excruciatingly frustrating” experience. “I believed we were collaborating, until I realized that the feeling was not mutual.” News stories around the world reported that Herculaneum scrolls had been deciphered at last. But, in fact, Mocella had claimed to read only letters, and some scholars are cautious about even those, not least because the group did not publish enough information for others to replicate the analysis. Mocella finally shared his data with Seales and others after publication. After reviewing it, Seales concluded that the findings were a bust. “The dataset did not produce any contrast at the ink,” he told me. Seales thinks the researchers, who were without software to model the surfaces within the scrolls, were seeing “ghosts”—random patterns in the papyrus’ fiber structure that just happen to look like letters. He is now convinced that phase-contrast tomography alone is not sufficient to read the Herculaneum scrolls in any meaningful way. (Mocella insists the letters he saw were real, and he took issue with Seales’ version of the incident. “From my point of view, I and my team are still working with Brent, since we’ve given him, as with other specialists like him, most of the scans,” Mocella said.) By that point Seales had finished a preliminary analysis of the Ein Gedi scroll, and in July 2015 he and the IAA announced their results. “We absolutely hit a home run,” Seales says. Unlike the authors of the Herculaneum scrolls, the Hebrew scribes had mixed metals into their ink. Seales’ software correctly mapped the letters to the rolled-up parchment, then virtually unfurled it, revealing all of the surviving text, in perfect sequence, on each of the five wraps of the scroll. There were 35 lines of text in two columns, composed of Hebrew letters just two millimeters tall. Israeli researchers identified the text as the first two chapters of the Book of Leviticus, dating to the third or fourth century A.D. It was a hugely significant find for biblical scholars: the oldest extant copy of the Hebrew Bible outside of the Dead Sea Scrolls, and a glimpse into the history of the Bible during a period from which hardly any texts survive. And it was proof that Seales’ method worked. Following Mocella’s publication, however, the Institut de France refused further access to its Herculaneum scrolls. Which is why Seales turned his attention to Oxford. ********** The Bodleian Libraries, at Oxford University, possess four Herculaneum scrolls, which arrived in 1810, after they were presented to the Prince of Wales. They are kept deep inside the building, in a location so secret that even David Howell, the Bodleian’s head of heritage science, says he doesn’t know where it is. Seales wasn’t permitted to see the intact papyri, never mind scan them. But one of the four, known as “P.Herc. 118,” was sent to Naples in 1883, to be unrolled using Piaggio’s machine. It came back as a mosaic of crumbs, which were glued onto tissue paper and mounted behind glass in 12 wood frames.
The text appears to be a history of Epicurean philosophy, probably by Philodemus, but it has been particularly challenging for scholars to interpret. A fragment might seem covered with continuous lines of writing, says Obbink, “but really every inch you’re jumping up or down a layer.” To prove the value of his approach, Seales asked the Bodleian to let him analyze P.Herc. 118. If all went well, he hoped, he might get a shot at scanning the intact scrolls later. “We wouldn’t necessarily have chosen to get involved, except for Brent’s enthusiasm,” says Howell. So in July 2017, the 12 frames were removed from storage and taken to Howell’s third-floor office—something of a coup for Seales, given how fragile and irreplaceable the fragments are. Cheerful and ruddy-faced, Howell has worked in conservation for close to 35 years, and even he felt daunted as the protective glass frames were removed, exposing the fragile papyrus beneath. “These are the most terrifying objects I’ve ever handled,” he says. “If you sneeze, they’d blow away.” Seales and another colleague scanned these scroll fragments using a hand-held 3-D scanner called an Artec Space Spider. Meanwhile, Howell carried out hyperspectral imaging, which uses hundreds of wavelengths of light. Howell listened to Pink Floyd through noise-canceling headphones to escape the grinding noise of the scanner, he says, plus the knowledge that if anything went wrong, “I might as well pack my bags and go home and not come back.” After Seales returned to Kentucky, he and his colleagues spent months mapping all of the available 2-D images onto the 3-D template produced by the Artec Space Spider. This past March, they returned to Oxford to present the results on a big screen to a packed conference room. At such a high resolution, the charred papyrus resembled a dark-brown mountain range as seen from above, with lines of text snaking over the ridges and peaks. There was a gasp from the audience as Seales’ student Hannah Hatch rotated the image, then zoomed into creases and peeked over folds, flipping seamlessly between high-resolution photographs, infrared images and even the disegni drawings—all matched up to the 3-D template. Shortly afterward, James Brusuelas, an Oxford papyrologist working with Seales, revealed several new details visible in the scans, such as the name of Pythocles, a young follower of Epicurus. More important, Brusuelas was able to decipher the column structure of the text—17 characters per line—which will be crucial for reading the rest of the roll, particularly when trying to join different fragments together. “We have the basic information we need to put Humpty Dumpty back together again,” he said. The audience buzzed with questions and applause. It was the reaction Seales was hoping for, and a step toward his real goal—gaining access to intact scrolls. He’d saved his own presentation until last. It wasn’t about P.Herc. 118, but rather one tiny letter: the lunate sigma. ********** Driving south from the stone archways and quadrangles of Oxford, the road soon cuts through flat green fields reaching to the horizon. On the day I visited, fork-tailed red kites hovered high in the blue July sky. After 15 or so miles a sprawling campus of low gray buildings came into view. At first, it resembled an ordinary industrial park, until I noticed the names of the roads: Fermi, Rutherford, Becquerel, all giants of 19th- and 20th-century physics.
Behind a wire fence a huge, silver dome, more than a quarter-mile in circumference, rose from the grass like a giant flying saucer. This was Diamond Light Source, and Seales was waiting inside. He’d brought a speck of charred papyrus from one of the Herculaneum scrolls he studied a decade earlier. The ink on it, he had found, contained a trace of lead. In Grenoble, direct X-ray imaging of the scrolls had not been enough to detect the ink. But when you fire hugely powerful X-rays through lead, the metal emits electromagnetic radiation, or “fluoresces,” at a characteristic frequency. Seales hoped to pick up that signal with a detector placed beside the fragment and specially calibrated to capture photons at lead’s characteristic frequency. It was a long shot. The minuscule fluorescence of the letter would be swamped by radiation from the protective lead lining the room—like looking for a flickering candle from miles away on a rainy night, Seales said, as we stood in the crowded hutch. But after several days of intense work—optimizing the angle of the detector, shielding the main X-ray beam with tungsten “flight tubes”—the team finally got what it was looking for: a grainy, but clearly recognizable, “c.” “We’ve proven it,” Seales said in triumph as he displayed the legible image to the Oxford audience in March. It is, Seales hopes, the last piece of the puzzle he needs to read the ink inside a Herculaneum scroll. The results have scholars excitedly re-evaluating what they might now be able to achieve. “I think it’s actually very close to being cracked,” says Obbink, the Oxford papyrologist. He estimates that at least 500 Herculaneum scrolls haven’t been opened. Moreover, excavations at Herculaneum in the 1990s revealed two unexplored layers of the villa, which some scholars believe may contain hundreds or even thousands more scrolls. Many scholars are convinced that Piso’s great library must have contained a range of literature far wider than what has been documented so far. Obbink says he wouldn’t be surprised to find more Latin literature, or a once-unimaginable treasure of lost poems by Sappho, the revered seventh-century B.C. poet known today only through the briefest of fragments. Michael Phelps, of the Early Manuscripts Electronic Library, in California, who recently used multispectral imaging to reveal dozens of hidden texts on reused parchment at St. Catherine’s Monastery, in Egypt, calls Seales’ methods “revolutionary.” Scholars have long faced a choice between attempting to read concealed texts (and potentially destroying them in the process) or conserving them unread. “Brent Seales’ technology is removing that dilemma,” Phelps says. Successfully reading Herculaneum scrolls could trigger a new “renaissance of classical antiquity,” says Gregory Heyworth, a medievalist at the University of Rochester in New York. He points out that virtual unwrapping could be applied to countless other texts. In Western Europe alone, he estimates, there are tens of thousands of manuscripts dating from before A.D. 1500—from carbonized scrolls to book covers made from older, glued-together pages—that could benefit from such imaging. “We’d change the canon,” Heyworth says. “I think the next generation is going to have a very different picture of antiquity.” ********** Seales has lately been enhancing his technique, using artificial intelligence to train his software to recognize subtle differences in texture between papyrus and ink.
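One hedged, highly simplified way to picture that idea: score each small patch of a scan for ink-like texture, score the same spot for photon counts in lead's fluorescence window, and keep only the places where both signals agree. In the Python sketch below, everything is an illustrative assumption rather than Seales' actual system; the nearest-centroid "classifier" stands in for a trained model, and names such as NearestCentroidInkModel and combined_ink_map are invented.

import numpy as np

def texture_features(patch):
    """Crude texture descriptors for a small patch: mean density, local
    contrast and gradient energy. A real system would learn these."""
    gy, gx = np.gradient(patch.astype(float))
    return np.array([patch.mean(), patch.std(),
                     np.abs(gx).mean(), np.abs(gy).mean()])

class NearestCentroidInkModel:
    """Toy stand-in for a trained classifier: a patch is ink-like if its
    features sit nearer the ink centroid than the papyrus centroid."""
    def fit(self, patches, labels):
        feats = np.array([texture_features(p) for p in patches])
        labels = np.array(labels)
        self.centroids = {c: feats[labels == c].mean(axis=0) for c in (0, 1)}
        return self

    def ink_score(self, patch):
        f = texture_features(patch)
        d_papyrus = np.linalg.norm(f - self.centroids[0])
        d_ink = np.linalg.norm(f - self.centroids[1])
        return d_papyrus / (d_papyrus + d_ink + 1e-9)  # near 1.0 = ink-like

def combined_ink_map(patches, model, lead_counts):
    """Fuse the texture score with a normalized fluorescence signal
    (photon counts in lead's emission window) for each patch location."""
    xrf = np.asarray(lead_counts, dtype=float)
    xrf = xrf / (xrf.max() + 1e-9)
    texture = np.array([model.ink_score(p) for p in patches])
    return texture * xrf  # high only where both kinds of evidence agree

# Toy training data: "ink" patches are slightly denser and noisier.
rng = np.random.default_rng(1)
papyrus = [rng.normal(0.4, 0.02, (8, 8)) for _ in range(20)]
ink = [rng.normal(0.5, 0.08, (8, 8)) for _ in range(20)]
model = NearestCentroidInkModel().fit(papyrus + ink, [0] * 20 + [1] * 20)

test_patches = papyrus[:3] + ink[:3]
print(combined_ink_map(test_patches, model, lead_counts=[1, 2, 1, 9, 8, 7]))

Crude as it is, the fusion rule shows why pairing the signals appeals: texture alone can mistake fiber patterns for letters, the "ghosts" problem, while the fluorescence signal alone is too faint and noisy to trust point by point.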
He plans to combine such machine learning and X-ray fluorescence to produce the clearest possible text. In the future, “it’ll all be automated,” he predicts. “Put it in the scanner and it will all just unfurl.” Seales is still negotiating with curators in Oxford, Naples and Paris for access to intact scrolls. He has surmounted huge technical hurdles, but the complex political challenge of navigating the gatekeepers, winning beam time at particle accelerators and lining up funding can, very occasionally, puncture his optimism. “How does a guy like me make all that stuff happen all at once?” he said in one such moment. He shrugged and looked around him. “It’s more than a computer scientist is really capable of doing.” Then belief returned to his wide, hazel eyes. “I refuse to accept that it’s not possible,” he said. “At every turn, there has been something that opened up.” Reading a complete intact scroll at last, he went on, would be “like returning home to your family, who have been waiting all along for you to do the thing you started.” *Editor's Note: This article was updated to correct the name of the French research facility that declined Seales' proposal to scan a Herculaneum scroll, and to clarify how the scrolls were ultimately scanned at Grenoble. This article is a selection from the July/August issue of Smithsonian magazine. Henrik Knudsen is a London-based photographer. Jo Marchant is an award-winning science journalist and former editor at New Scientist and Nature. She is the author of The Human Cosmos: Civilization and the Stars.
d58f8f54c35def1a3f9fc26311aa0cb9
https://www.smithsonianmag.com/history/cannibal-club-racism-and-rabble-rousing-victorian-england-180952088/
The Cannibal Club: Racism and Rabble-Rousing in Victorian England
The Cannibal Club: Racism and Rabble-Rousing in Victorian England Bertolini's restaurant was cheap, but charming: perfect for the creatures who roamed 19th-century London after the sun went down. On Tuesday nights, in Bertolini's backroom, respected judges and doctors, esteemed lawyers, admired politicians and award-winning poets and writers drank heavily, smoked cigars and secretly discussed what they thought they knew of the British colonies, more specifically polygamy, bestiality, phallic worship, female circumcision, ritual murder, savage fetishes and island cannibalism. The gentlemen would trade in exotic pornography and tales of flogging and prostitution. If, by chance, a pious, God-fearing bloke were to accidentally stumble into the Fleet Street backroom on a Tuesday night, the tips of his Victorian moustache would've certainly stood on end. Before the debate between science and creationism, there was the debate between monogenism and polygenism. Monogenists believed that all of humanity shared a common ancestry, while polygenists were convinced that different races of man had different origins. There was a palpable tension in Victorian England between the creation of a democratic scientific methodology and the elitist attitudes that reinforced Anglo-Saxon superiority. During Britain's "Imperial Century," these convenient human classifications were perfectly in line with colonialist sensibilities—of course, no race could match the enlightenment of the English Gentleman. The conflict captured the imagination of Victorian England and, by 1863, drove a wedge between the polygenist and monogenist members of the then 20-year-old Ethnological Society of London. Determined to continue advocating their polygenist ideologies, Captain Richard Francis Burton and Dr. James Hunt, both members of the Ethnological Society of London, broke away and established The Anthropological Society of London. The new splinter society supported the pseudoscientific practices of phrenology, the measuring of skull size with craniometers and, of course, polygenism. Recent scholarship has even suggested that its members were covert propagandists, acting on behalf of the Confederate States of America to convince Londoners that enslaved Africans were biologically incapable of any development beyond their menial work as slaves. From the intellectual ferment of the Anthropological Society's inaugural year grew an even more exclusive and overtly seditious conclave of high-society rebels: a gentlemen's dining group called the Cannibal Club. Though Hunt was the president of the Anthropological Society, Burton, who possessed a Byronic love for shocking people, was to be the mastermind behind the new hush-hush fraternity. An experienced geographer and explorer, a writer and translator who spoke 29 languages, a decorated captain in the army of the East India Company and a renowned cartographer, Richard Francis Burton was also considered by some to be a rogue, a murderer, an impostor and betrayer, a sexual deviant, and a heroic boozer and brawler. He was six feet tall with a barrel chest and an imposing scar on his left cheek. He was famous for infiltrating Mecca in 1853, disguised as an Arab merchant, and for translating the raw, unexpurgated texts of erotic Eastern literature such as the Kama Sutra and the Arabian Nights. He was presented to the Queen and he dined with the Prime Minister.
When asked by a young vicar if he'd ever killed a man, Burton replied coolly, "Sir, I'm proud to say that I have committed every sin in the Decalogue." Burton was one of Hell's original hounds and the Cannibal Club was his sanctuary. Sitting around in stovepipe top hats, tailored frocks and loosened cravats in the banquet room of Bertolini's, the members would be called to order with a strike of Burton's gavel. The gavel, naturally, was a piece of wood carved in the likeness of their official symbol: a mace drawn to resemble an African head gnawing on a thighbone. And before launching into one of their raucous powwows, a member would stand and recite the club's Cannibal Catechism: an anthem of sorts that purposely mocked the Christian sacrament of the Eucharist, likening it to a cannibal feast. The opening stanza of the invocation, written by Cannibal Club mainstay, respected playwright and decadent poet Algernon Charles Swinburne, depicts just how profoundly blasphemous and anti-clerical the group was: Preserve us from our enemies; Thou who art Lord of suns and skies; Whose meat and drink is flesh in pies; And blood in bowls! Of thy sweet mercy, damn their eyes; And damn their souls! Swinburne, a short and fragile man with a little weasel-like mouth, was perhaps one of the club's most debauched members. A suicidal algolagniac, an alcoholic and a habitué of London's flagellant brothels, Swinburne nonetheless contributed to the eminent 11th Edition of the Encyclopedia Britannica and was nominated for the Nobel Prize in Literature every year from 1903 to 1907 and again in 1909. After Swinburne's Catechism was recited, the members would "eat, drink, and let their conversations veer absolutely wherever they wanted," writes Monte Reel in Between Man and Beast. "The members were drawn to one another thanks to a shared hatred for one 'Mrs. Grundy'—a fictional composite who epitomized the tight-laced prudery that threatened to define the era." Needless to say, no minutes were kept during the meetings. Cannibal Club members were culture-warriors. They were generally sympathetic to all religions yet loyal to none. They were unapologetic hedonists and scientific racists. They exhibited an unbridled interest in the various expressions of human sexuality and saw sexual repression as a national crisis. Another central figure of the club was Charles Bradlaugh, a political activist, renowned atheist and the founder of the National Secular Society. Bradlaugh was a pamphleteer who openly published information on land reform and birth control. In 1880, when Bradlaugh was elected to Parliament, he refused to take the religious oath—an act for which he was briefly imprisoned in a cell beneath Big Ben. In 1891, his funeral was attended by 3,000 people including a then 21-year-old Mohandas Gandhi. Another cornerstone of the club was Baron Monckton Milnes, a poet, patron of literature and politician. Milnes' unrivaled private collection of pornography, which was known to few during his lifetime, now sits in the British Library. English author Jean Overton contends that Milnes was the author of The Rodiad, an unattributed pornographic poem published in 1871 about a schoolmaster who derives pleasure from flogging young boys. Cannibal Club members truly lived dual lives: honorable gentlemen by day, perverse pleasure-seekers by night. England was teeming with Mrs. Grundys at the time, and cultural non-conformists like Burton and his Cannibals had had enough of her. "Mrs.
Grundy is already beginning to roar," Burton once said while working on his translation of the Arabian Nights. "Already I hear the fire of her. And I know her to be an arrant whore, and tell her so, and don't give a goddamn for her." Mrs. Grundy eventually manifested herself in the Society for the Suppression of Vice and various British obscenity laws such as the Obscene Publications Act of 1857—all established to weed out counter-culturists and prosecute them for their indecency. And although the Cannibal Club had insulated itself to allow for the free and safe airing of subjects deemed deviant by society, it was simultaneously taking it upon itself to challenge prudish conventions and work towards a more liberal London. But the Cannibal Club, as a furtive extension of The Anthropological Society, had motivations beyond simply rabble-rousing. In Reading Arabia: British Orientalism in the Age of Mass Publication, 1880-1930, author Andrew C. Long writes: [The Cannibal Club's] central activity was the production and distribution of colonialist pornography for their circle and other elite consumers. However—and this is key for the formation of colonial and imperial ideology—they justified their activities as the pursuit of science and art, where pornography, or their pseudoscientific combination of sexology and anthropology, would help to understand better the specific sexual practices and culture in the far-flung reaches of the Empire. In the latter half of the 19th century, the erotic viewing privileges of the club and its consumers were undermined as British and French companies began to mass-produce pornographic postcards, many of them exploiting colonial imagery much as the Cannibal Club had been doing all along. The Cannibal Club lasted just a few short years, however. After Hunt died in 1869 and Burton's diplomatic postings took him abroad, the old gang began to thin out. By the early 1870s, Darwin's On the Origin of Species (1859) was selling at a rate of 250 copies per month in Britain, and his follow-up, The Descent of Man, and Selection in Relation to Sex (1871), which focused more on sexual selection and evolutionary ethics, had just hit shelves. Subsequently, the racially motivated polygenist ideology adhered to by The Anthropological Society and, by extension, The Cannibal Club, became passé. In his paper "The Cannibal Club and the Origins of 19th Century Racism and Pornography" (2002), John Wallen asserts that Burton tried to revive the Cannibal Club sometime in the 1870s, without success. In 1871, the Anthropological Society and the Ethnological Society of London reunited to form The Royal Anthropological Institute of Great Britain and Ireland, which is active to this day, promoting the public understanding of anthropology. In 1886, Burton, respected geographer, racist and rabble-rouser, was knighted by Queen Victoria. Jeff Campagna is a Canadian journalist, author and adventurer based in Latin America.
8bdb82483261219bd8a67e425d114e78
https://www.smithsonianmag.com/history/capturing-warsaw-at-the-dawn-of-world-war-ii-63976601/
Capturing Warsaw at the Dawn of World War II
Capturing Warsaw at the Dawn of World War II Like other members of his generation, Julien Bryan would never forget where he was or what he was doing when he learned that Germany had invaded Poland. But Bryan had a better reason to remember than most: on that September 3, 1939, he was stopped at what was then the Romanian-Polish border on a train bound for Warsaw. “Why, at this moment, I did not turn around...I do not know,” Bryan would recall of learning of the invasion two days after the onslaught began. With bombs exploding nearby, the train resumed its cautious journey toward the capital—with Bryan on board for a front-row seat at the commencement of World War II. Bryan, who came from Titusville, Pennsylvania, had seen combat as a 17-year-old ambulance driver in France during World War I. After graduating from Princeton, in 1921, he traveled widely, taking photographs and making travelogues or human-interest films along the way. That summer of 1939, he had been shooting peasant life in Holland. On September 7, he disembarked in predawn darkness in besieged Warsaw. “I was in a city about to face perhaps the worst siege of all modern history,” Bryan would write. Other cities, of course, would suffer terrible assaults later in the war—London, Berlin, Hiroshima and many more—but early on, Warsaw was hit by wave after wave of modern bombers, to which the German Army added what Bryan called the “hot steel spray” of exploding artillery as it advanced. While the retreating Polish Army valiantly resisted the advancing German columns, Warsaw’s 1.3 million inhabitants were subjected to furious bombardment. Hospitals, churches and schools were hit. Bryan wrote that a 40-unit apartment building “looked as if a giant with an ice-cream scoop had taken out the entire central section.” Homeless families crowded the streets, pushing what remained of their belongings in wheelbarrows and baby carriages. All this was happening, essentially, out of the world’s sight; Bryan was the sole foreign journalist left in the city. He acknowledged the journalistic tingle of getting “a grand scoop,” but he also recognized the historical imperative to capture the horror of modern warfare for the world to see. “I was not,” he realized, “making a travelogue.” Bryan walked the streets with a Leica still camera and a Bell & Howell movie camera. Day by day the job grew riskier. He confessed that he and his Polish interpreter, Stephan Radlinski, often wanted to run when a bomb landed close by. “But neither of us ran, because each was afraid of what the other might think,” he wrote. On Rosh Hashana, the Jewish New Year, incendiary bombs set 20 blocks of the Jewish quarter aflame. Among shattered buildings near the Vistula River, Bryan took several frames of a boy clutching a bird cage. Twenty years later, after Bryan republished his photographs in a local newspaper, Zygmunt Aksienow identified himself as the boy in the photograph. Now 80, Aksienow recalls that two large bombs had fallen near his family’s apartment building and “the street was full of broken glass, furniture and parts of human bodies.” A bird cage “blew out of a house, along with a window” and landed in the rubble. Aksienow picked it up, thinking that the canary it held—very much alive—might belong to his cousin Zofia, a neighbor. “I was a scared 9-year-old, out looking for some sign of the normal life I was used to,” he says today. 
Aksienow, who would grow up to be a coal miner, no longer recalls what happened to the canary, but he remembers clearly the cruel winter that followed the invasion. His family's apartment had been heavily damaged and food was scarce, but just before the traditional Christmas Eve feast, young Zygmunt walked in with two buckets of fish, which he and a pal had stunned by tossing a hand grenade they'd found into the Vistula. Bryan had no idea how he might get out of Warsaw. But on his 14th day there, the Germans declared a cease-fire to allow foreigners to depart by train through East Prussia. Certain that the Germans would confiscate any photographs of the destruction they had wrought, Bryan resolved to smuggle his film out. He gave some to departing companions to hide in their gear, and by one account he wrapped around his torso yards of movie film that he'd had the foresight to have processed in Warsaw. After reaching New York City, he reassembled an awesome trove: hundreds of still negatives and more than 5,000 feet of motion picture film. That autumn, U.S. newspapers and magazines splashed Bryan's photos across their pages. Life magazine printed 15 of his images, its weekly rival, Look, another 26—including the one of Aksienow with the caged canary. In 1940, Bryan put together a book about his experience, titled Siege; his documentary of the same name was nominated for an Academy Award. Bryan died in 1974, just two months after receiving a medal from the Polish government for his still photography, which is preserved at the U.S. Holocaust Memorial Museum in Washington, D.C. His Warsaw film is listed on the Library of Congress' National Film Registry as a "unique, horrifying record of the dreadful brutality of war." Mike Edwards was a writer and editor for National Geographic for 34 years.
0a1705ba93a515b80b05314fd95bd23c
https://www.smithsonianmag.com/history/catching-up-with-old-slow-trot-148045684/
The Civil War
The Civil War Out of the August night, James Gurley came galloping past the massive oak before Elizabeth Thomas' white plantation house. Get out! he shouted. Take your family and run! Now! The renegade slave leader Nat Turner was coming with a band of vengeful slaves, rampaging from farm to farm, killing white men, women and children. George Henry Thomas, 15, piled into a carriage with his mother and sisters and racketed along dirt roads into the darkness. Before they had gone far, afraid the assassins would overtake them, they abandoned the carriage and took to the woods. In and out of gloomy Mill Swamp, across Cypress Bridge and the bottomlands of the Nottoway River, they escaped to the county seat of Jerusalem, some 12 zigzag miles from home. Nat Turner's 1831 insurrection, in Southampton County, Virginia, was the bloodiest slave uprising in American history. Before it ended, 55 whites were killed. It stirred deep fears across the South, sweeping aside any talk of gradual emancipation, and hardened both sides in the long-running debate that ended in civil war. What it did to young George Thomas, who as a Union general became one of the most successful, most controversial, yet least recognized figures of that war, remains a question unsettled. While Turner and his band, armed with guns, clubs, axes and swords, carried out their gruesome task, Thomas' mother led her family to safety, helped by some of her own slaves, according to local tradition. George's father had died two years earlier. The boy's uncle, James Rochelle, who had mentored him since his father's death, was clerk of the court where Turner confessed and was hanged that November. Young George was immersed in the initial panic, the mobilization of militia and the fury of citizens demanding prompt justice. He heard talk that all the trouble would never have happened if Turner had not been taught to read and write. Teaching slaves was illegal in Virginia and across the South, but George was among the many who had broken the law, teaching his own family's 15 slaves to read. After attending the local academy, he became his uncle's deputy clerk and took up the study of law at the county courthouse. But he was restless, and gladly accepted an appointment from his congressman to the U.S. Military Academy at West Point. He would long remember the parting advice he got from his brother John: "Having done what you conscientiously believe to be right, you may regret, but should never be annoyed by, a want of approbation on the part of others." It was advice that would prove prophetic. Nearly six feet tall, solid in body and stubborn in temperament, George was almost 20 years old when he arrived at West Point. His roommate was a red-haired, impulsive Ohioan named William Tecumseh "Cump" Sherman. They became friendly rivals, and after four years Sherman had finished 6th, Thomas 12th, among the 42 members of the class of 1840. Along the way, Thomas put a halt to the hazing of some fellow cadets by threatening to throw a bullying upperclassman out a barracks window; after years helping supervise a sprawling plantation, he had learned to exert calm authority. Among the cadets, his gravitas earned him the first of his many nicknames: Old Tom. Five months after graduation, Thomas sailed for Florida and the long, ugly little war begun by Andrew Jackson to force the Seminole Indians onto reservations. Thomas' captain wrote an appraisal that would well describe his entire career: "I never knew him to be late or in a hurry.
All his movements were deliberate, his self-possession was supreme, and he received and gave orders with equal serenity." Real war lay ahead in Mexico, where as an artillery lieutenant under Gen. Zachary Taylor in 1846, Thomas won honorary promotion to captain for his conduct in the pitched battle of Monterrey. Then Thomas was breveted to major for the way he handled his guns at Buena Vista, when Taylor defeated Mexican general Santa Anna in the last major battle in northern Mexico. Southampton County was proud of its son, and presented him a magnificent sword, its gold pommel clasping an amethyst, its silver scabbard engraved with the names of his battles. On its grip was the image of an elephant—among soldiers, to have been in combat was to have "seen the elephant." And Thomas was still devoted to home: disappointed that his brother had not picked a bride for him, George said, "I would prefer one from the old state to any other, and as I am now so much of a stranger there I am afraid I should not know where to look. ..." In his letters, he worried about his unmarried sisters, left lonely on the farm, saying "domestic differences are to me the most horrible of which I can conceive." He could not yet imagine the scope of the domestic differences that lay ahead. In 1851 he headed to the prize assignment of artillery instructor at West Point. At every stop since his first arrival there, he had met and measured cadets and fellow officers who would figure in his future—Sherman, J.E.B. Stuart, John Schofield, William Rosecrans, Braxton Bragg, John Bell Hood, among dozens destined to become famous in Civil War history. None was more impressive than the superintendent of the academy, Lt. Col. Robert E. Lee, and no one there impressed Lee more positively than upright, conscientious George Thomas. Under Lee, Thomas had the additional duty of cavalry instructor. In that role, Thomas won yet another nickname, Old Slow Trot, for restraining cadets from galloping their mounts. Since his brother had not found him a bride, Thomas found his own—tall, strong-minded Frances Kellogg, an upstate New Yorker, cousin of a cadet from Troy. He wore his ceremonial sword for the only time in his life when they were married in the academy chapel in November 1852. Within six months, Thomas had to leave his bride for duty in the far Southwest; it would be three years before he saw her again. In a desert clash with a Comanche brave, he narrowly escaped death when an arrow glanced off his chin before lodging in his chest. Thomas pulled it out and, after a surgeon dressed the wound, went about his business. Then, in 1860, with the country in crisis after Abraham Lincoln was elected president, Thomas headed home on leave. While there, he worried about his future as the Southern states began to secede. Governor John Letcher offered to make him Virginia's chief of ordnance. In turning that position down, Thomas wrote: "It is not my wish to leave the service of the United States as long as it is honorable for me to remain in it, and therefore as long as my native State Virginia remains in the Union it is my purpose to remain in the Army, unless required to perform duties alike repulsive to honor and humanity." A month later, in April 1861, on the day Confederate guns opened against Fort Sumter in Charleston Harbor, Thomas sent telegrams to his wife and sisters, stating that he would remain loyal to the Union. 
We do not know exactly what he said then or what was going on inside him at other critical moments, because all his personal papers were destroyed. But his wife said that "whichever way he turned the matter over in his mind, his oath of allegiance to his Government always came uppermost." When Lincoln called for troops to put down the insurrection, Virginia joined the Confederacy, along with most of her professional soldiers. But Thomas stayed true to his oath, and to this day has been reviled by many Southerners for that decision. Even his own sisters turned his picture to the wall and denied that they had any such brother. They returned his letters unopened and ignored his request to send him the ceremonial sword he had left with them for safekeeping. He also lost contact with his brothers. Some called him a turncoat. The truth is that Thomas, like many other soldiers, was torn by the wrenching decision he was forced to make. So was his friend Lee, who opposed secession and agonized over resigning from the U.S. Army that he had served so faithfully. But Lee ultimately headed South, saying he could not bring himself to fight against his home, family and friends. It is also true that Lee had a much larger stake in Virginia, in its plantations and history, than Thomas did in his more modest place in Southampton. And besides his loyalty to the old flag, Thomas was committed to a Northern wife who was as strongly Unionist as his sisters were secessionist. His memories of Nat Turner's insurrection might have hardened him into a determined defender of slavery, as they did so many of the Southern officers who went with the Confederacy. Instead—perhaps remembering the eager blacks he had taught to read and write—he fought to overturn the "peculiar institution." Though he left no bold statements of how he felt, when his duty came to include ending slavery, he carried it out just as forcefully as when it stood for simply preserving the Union. Those who protest Thomas' decision have made less of the fact that old Winfield Scott, general in chief of the Army in the early months of the war, was also a Virginian. He had been a national figure since the War of 1812, but by late 1861 he had retired and no longer mattered. Tens of thousands of Southerners fought for the Union, but Thomas has been the focus of resentment for one reason: he was a better general than the others. As early as his cadet days, Thomas' contemporaries had seen a resemblance to George Washington in his classic profile, his integrity and his restrained power. In 48 months of war, as his brown hair and well-trimmed beard began to gray, he would attain a certain grandeur that only strengthened that comparison. He seldom showed his explosive temper, but when he did, it was remembered. He disdained theatrics and politics; to general and future president James A. Garfield, his whole life seemed "frank and guileless." Thus in character, if not in gambling instinct, he also closely resembled Lee, who was a role model for so many younger officers who served under him. Thomas would earn the undying loyalty of soldiers like Henry Van Ness Boynton, who won the Congressional Medal of Honor fighting under him in 1863. Boynton wrote that Thomas "looked upon the lives of his soldiers as a sacred trust, not to be carelessly imperiled.
Whenever he moved to battle, it was certain that everything had been done that prudence, deliberation, thought and cool judgment could do under surrounding circumstances to ensure success commensurate with the cost of the lives of men. And so it came to pass that when the war ended it could be truthfully written of Thomas alone that he never lost a movement or a battle." But for Thomas, every battlefield success seemed to stir controversy or the jealousy of ambitious rivals. Unlike other noted generals, he had no home-state politicians to lobby on his behalf in Washington. Ulysses S. Grant, for example, was championed by Illinois congressman Elihu Washburne, and Sherman by his brother, Ohio senator John Sherman. For Thomas, every step upward depended solely on his performance in the field. In one of the war's first skirmishes, he led a brigade in the Shenandoah Valley that bested Confederates under Stonewall Jackson. When the dashing Rebel J.E.B. Stuart heard that Thomas was commanding Union cavalry, he wrote to his wife that "I would like to hang him as a traitor to his native state." Even after that, there was lingering doubt among some Unionists, including Lincoln. Unlike Grant, Sherman, George McClellan and some other ranking Union officers who had broken their military service with years as civilians, Thomas had been a soldier since the day he entered West Point. Yet when his name came up for promotion, the president, restrained by Northern radicals and surrounded in the Federal bureaucracy by Southerners, said, "let the Virginian wait." But Sherman among others vouched for Thomas, and soon the Virginian was elevated to brigadier general and ordered to organize troops away from Virginia, beyond the Appalachians. There, in January 1862, he sent a bulletin of encouragement to a Union hungry for good news. After an 18-day march on muddy roads, his division confronted Rebels at Mill Springs, Kentucky. Amid cold rain and gun smoke, he led his outnumbered troops in repulsing Confederates under Maj. Gen. George Crittenden and then drove them across the Cumberland River. Though not a massive victory, it was the first notable Northern success of the war, turning back a Confederate move from eastern Tennessee into Kentucky. Thomas was promoted to major general, an advancement that would soon create friction with his old roommate "Cump" Sherman and Grant, who had become so close that an affront to either was resented by both. After winning praise for capturing Forts Henry and Donelson in western Tennessee, Grant had fallen out of favor for mismanaging and very nearly losing the bloody Battle of Shiloh. He was criticized for taking 13,000 casualties and was suspected of drinking on the job. Sherman, whose excitability and wild overestimates of Rebel strength had caused some to question his sanity, had fought bravely after an initial mistake at Shiloh. When Union forces moved south toward Corinth, Mississippi, that spring, Union general Henry Halleck shunted Grant into a figurehead role and gave Thomas temporary command of the wing that included Grant's Army of the Tennessee. Grant, angered, was talked out of quitting by Sherman. Grant would not forget the incident. Grant and Sherman would redeem themselves by grasping control of the Mississippi River in the costly, circuitous campaign that resulted in the capture of Vicksburg in mid-1863. 
While they were operating on the Mississippi, Thomas led a corps in Rosecrans' Army of the Cumberland, earning respect in fights like that at Stones River, where he declared, "This army does not retreat," and backed up his words with actions on the field. There and at Tullahoma, Rosecrans' force pressed the Confederates back into eastern Tennessee. As Thomas rose, he proved to his men that his addiction to detail and his insistence on preparation saved lives and won battles. His generalship behind the front, before the battle, was generations ahead of his peers. He organized a professional headquarters that made other generals' staff work seem haphazard. His mess and hospital services, his maps and his scouting network were all models of efficiency; he was never surprised as Grant had been at Shiloh. He anticipated modern warfare with his emphasis on logistics, rapidly repairing his railroad supply lines and teaching his soldiers that a battle could turn on the broken linchpin of a cannon. He demanded by-the-book discipline, but taught it by example. He made no ringing pronouncements to the press. His troops came to understand his fatherly concern for their welfare, and when they met the enemy they had faith in his orders. In late summer, Rosecrans moved against the Rebel stronghold of Chattanooga, a crucial gateway between the eastern and western theaters of war. Confederate general Bragg pulled out of the town onto the dominating nearby mountains, waiting for Maj. Gen. James Longstreet to bring reinforcements from Virginia. When they came, Bragg threw everything into an assault on Union lines along Chickamauga Creek, just inside Georgia. Thomas' corps was dug in on the Union left. On the second day of furious fighting, a misunderstood order opened a wide gap on his right. Longstreet's Rebels crashed through; with the always aggressive John Bell Hood's division leading, they bent the Union line into a horseshoe. Rosecrans, certain the battle was lost, retreated into Chattanooga with five other generals and thousands of blue-uniformed soldiers. But Thomas inspired his men to stand fast, and only their determined resistance saved his army from destruction. They held all that afternoon against repeated Confederate assaults, withdrawing into Chattanooga after nightfall. It was the greatest of all battles in the West, and since that day, Thomas has been known to history as the Rock of Chickamauga. In the aftermath, Rosecrans was fired and Thomas took command of the Army of the Cumberland. But the Union situation remained dire. Bragg, still holding those formidable mountains, laid siege to Chattanooga. Grant, commanding Union armies between the Mississippi and the mountains, ordered Thomas to hold the city "at all costs," and rushed troops east to help. "I will hold the town till we starve," Thomas replied, and they almost did starve. Cut off from supplies, his army was living on half rations. Thousands of horses and mules died. Weeks passed before Grant assembled strength sufficient to lift the siege. The key terrain was towering Missionary Ridge. Grant ordered Sherman to drive onto the ridge from the left and Maj. Gen. Joseph Hooker from the right, with Thomas aimed at the center. Sherman tried and failed to carry his end, but Hooker's troops took Lookout Mountain on the far flank. Thomas waited for Grant's order to advance. When it came, Thomas took his time studying the crest with his binoculars, then sent his troops ahead with orders to occupy only the first line of the Confederate works.
They did so in fine style—and then, seeing that they were exposed to fire from above, kept going. Thomas was surprised and Grant angry, demanding, "Who ordered those men up the hill?" No one had. The troops plunged ahead, pressing on against heavy fire, struggling up the steep slope and jubilantly planting their flag on the heights for all to see. Assistant Secretary of War Charles A. Dana, an eyewitness, called the assault "one of the greatest miracles in military history....as awful as a visible interposition of God." Thomas, moved by the sight, ordered that a cemetery be created for his soldiers on a beautiful slope of the battlefield. When a chaplain asked if the dead should be separated by state, Thomas did not hesitate. "No, no," he said. "Mix them up. Mix them up. I'm tired of states' rights." Once he had made up his mind to stay with the old flag, he never expressed misgivings; if he had them, they had long been erased by seeing so many men die to preserve the Union. By late 1863, U.S. Colored Troops were filling some of the gaps opened in Federal forces by battle and disease. Although Sherman had resisted using black soldiers, Thomas gladly accepted them. In the drastic move from serfdom to freedom, he wrote, it was probably better for ex-slaves to be soldiers, and thus gradually learn to support themselves, than "to be thrown upon the cold charities of the world without sympathy or assistance." As the Federals gathered strength to thrust into Georgia, this was not the only disagreement between the tightly strung Ohioan and the calm Virginian. In early March 1864, Lincoln called Grant east to become general in chief of all Northern armies. No one was surprised that Grant's friend Sherman, rather than Thomas, replaced him as commander in the West, even though as a major general Thomas was senior to Sherman. Ex-colonel Donn Piatt, a 19th-century booster and biographer of Thomas, called it "the nakedest favoritism that ever disgraced a service." At the start of his 1864 drive toward Atlanta, Sherman rejected Thomas' plan to take his command through Snake Creek Gap to cut off and smash Joseph Johnston's Confederate army. More than a month into Georgia, an impatient Sherman complained to Grant that Thomas' Army of the Cumberland was slowing his advance—"a fresh furrow in a plowed field will stop the whole column." He was still in this mood a few days later when he ignored Thomas' advice against attacking the strongly entrenched Rebels head-on at Kennesaw Mountain. The Federals lost more than 2,000 troops in trying to take what Thomas had warned was an impregnable position. Thomas commanded about two-thirds of Sherman's infantry; his army was the center force, the sledgehammer in the four-month campaign, and led the way into Atlanta. But neither Sherman, Grant, Secretary of War Edwin Stanton nor Lincoln cited Thomas in their congratulations. As in the 1864 Virginia campaign, where all the official praise and headlines went to Grant, in Georgia it was all Sherman. In his special order announcing the victory, Sherman credited Maj. Gen. Henry W. Slocum's corps with entering the city first—although Slocum was under Thomas' command and had headed the corps for only six days. When Atlanta's mayor protested Sherman's harsh military rule, the general replied, "War is cruelty and you cannot refine it...those who brought war into our country deserve all the curses and maledictions a people can pour out....You might as well appeal against the thunderstorm."
Then he set out on his storied march to infamy and greatness, pillaging the countryside as he cut a great swath through the Confederacy. Thomas took a different view. Stern though he was in combat, he posted a guard at the house of a citizen suspected of disloyalty because, he said, "We must remember that this is a civil war, fought to preserve the Union that is based on brotherly love and patriotic belief in the one nation....The thing becomes horribly grotesque...when we visit on helpless old men, women, and children the horrors of a barbarous war. We must be as considerate and kind as possible, or we will find that in destroying the rebels we have destroyed the Union." Opposite in personality, tactics and philosophy, Thomas and Sherman were thereafter gratefully separated in geography as well. While Grant grappled with Lee in Virginia and Sherman gutted the eastern Confederacy, Thomas was sent back to Tennessee to reorganize the stripped-down Army of the Cumberland and deal with Hood. The Confederate general had got away from Atlanta with some 40,000 troops and evaded Sherman's effort to catch him. Now he was marching north through Tennessee. Thomas' Federals under John Schofield slowed and badly damaged the Rebels in the fierce battle of Franklin, but by December Hood was dug in on the high ground facing Nashville. Thomas fortified the city while he gathered strength for a decisive blow, but to carry it out he needed more men, horses and supplies. Grant, 500 miles away, grew impatient. He sent telegrams urging Thomas to move, then ordered him to "attack at once." Thomas said after the war that he was tempted—"grossly improper as it would have been"—to ask why Grant himself, who was entrenched around Petersburg, was not fighting. Defeat at Nashville "would have been a greater calamity than any which had befallen the Federal forces," he said. "It would have cleared the way for the triumphant march of Hood's army through Kentucky, and a successful invasion of Indiana and Illinois, in which there were no Federal troops. It was therefore of the last importance that the battle upon which so much depended should not be fought until I was ready for it." Thomas continued planning, training, stocking—equipping his horsemen with the new breech-loading Spencer carbines. Then, just when he was ready, a sleet storm froze both armies in place for days. Grant, furious that Thomas had failed to engage the enemy, decided to relieve him from command, first with one general, then another. Finally he started to go west to fire him in person. But before he left Washington, the ice melted in middle Tennessee. On December 15, Thomas, unaware that Grant intended to fire him, roared out of his works against Hood. In two days his troops crushed the Rebel army. His infantry, including two brigades of U.S. Colored Troops, smashed into Hood's troops while the Union cavalry, dismounted with its fast-firing Spencers, curled around and behind the Rebel left. Almost a century later, historian Bruce Catton summed up the battle in two words: "Everything worked." Thomas "comes down in history...as the great defensive fighter, the man who could never be driven away but who was not much on the offensive. That may be a correct appraisal," wrote Catton, an admirer and biographer of Grant. "Yet it may also be worth making note that just twice in all the war was a major Confederate army driven away from a prepared position in complete rout—at Chattanooga and at Nashville. 
Each time the blow that finally routed it was launched by Thomas." Nashville was the only engagement in which one army virtually annihilated another. Thomas B. Buell, a student of Civil War generalship, wrote that in Tennessee, Thomas performed the war's "unsurpassed masterpiece of theater command and control....So modern in concept, so sweeping in scope, it would become a model for strategic maneuver in 20th-century warfare." After it, there was no more large-scale fighting west of the Blue Ridge. When the bloodshed was over at last, after Lincoln was assassinated and the nation was recovering from the shock, 150,000 soldiers of all the Union armies converged on Washington for the most memorable victory parade in the nation's history. All of them, that is, except the Army of the Cumberland. When Sherman proudly passed in review before Grant, President Andrew Johnson and multitudes of cheering onlookers, Thomas had already said goodbye to his few remaining troops. Back in Nashville, in a message that his innate reserve did not let him utter in person, he described his thoughts as he watched their last parade: "The coldest heart must have warmed" at seeing the men who had endured "this great, modern tragedy," he wrote—men "who had stemmed with unyielding breasts the rebel tide threatening to engulph the landmarks of freedom, and who, bearing on their bronzed and furrowed brows the ennobling marks of the years of hardship, suffering and privation, undergone in defense of freedom and the integrity of the Union, could still preserve the light step and wear the cheerful expressions of youth." Thomas' own youth was long behind him. In four years of hard service, he had taken not a single day of leave. During Reconstruction, he commanded troops in Kentucky, Tennessee, Mississippi, Alabama and Georgia. He was considerate toward ragged defeated soldiers, but he was as strict as the angriest Northern Radical in opposing the Ku Klux Klan and defiant politicians. "Everywhere in the states lately in rebellion, treason is respectable and loyalty odious," he said. "This, the people of the United States, who ended the rebellion and saved the country, will not permit." When President Johnson wanted to make him a full general, Thomas declined, understanding the move as Johnson's attempt to sidetrack Grant's progress toward the White House. He said he had done nothing since the war to deserve promotion, and if the honor was for wartime service, it had come too late. When he heard talk of nominating him for president, he staunched that too. So Grant was duly elected, in 1868, and soon afterward transferred Thomas to San Francisco. There, in 1870 at the age of 53, the Rock of Chickamauga suffered a stroke and died. The train bearing his body crossed the country to his wife's hometown of Troy, New York, with troops firing salutes along the way. President Grant and General in Chief Sherman, putting aside for the moment their criticism of Thomas, led the throng of mourners at the funeral. But no one was there from the Thomas family of Southampton County. Shortly after Lee's surrender, Union general John Gibbon had heard that the Thomas sisters were suffering, and sent them a wagonload of supplies as a token of his friendship for their brother. Judith Thomas would not accept, insisting she had no brother George, that he had died on the day Virginia seceded. In 1879, veterans of the Army of the Cumberland dedicated an equestrian statue of Southampton's most distinguished son in Washington's Thomas Circle. 
He peers down 14th Street toward Virginia today, as dense traffic runs around him; perhaps one passerby in a thousand knows who he is and what he did for the nation. After Thomas died, Grant was able to say that he was "one of the great names of our history, one of the greatest heroes of our war." Sherman relented so far as to write that "during the whole war his services were transcendent." Yet even then, the two generals seldom mentioned his name without repeating their assertions of his caution. When the two surviving Thomas sisters were nearing 90, they allowed the general's prize sword to go to the Virginia Historical Society in Richmond, where it remains. As a further gesture of reconciliation, they sent acorns from the great oak outside the home place to be planted around his statue in Washington. The acorns never sprouted. Ernest B. "Pat" Furgurson is the author of Freedom Rising: Washington in the Civil War and Ashes of Glory: Richmond at War, plus other books about war and politics. He lives in Washington, D.C.
49fa179d77befe9e1b2998eca94bc4a3
https://www.smithsonianmag.com/history/cave-markings-tell-cherokee-life-years-indian-removal-180971928/
Cave Markings Tell of Cherokee Life in the Years Before Indian Removal
Cave Markings Tell of Cherokee Life in the Years Before Indian Removal On April 30, 1828, a Cherokee stickball team stepped into the underworld to ask for help. Carrying river-cane torches, the men walked into the mouth of Manitou Cave in Willstown, Alabama, and continued nearly a mile into the cave's dark zone, past impressive flowstone formations in the wide limestone passageway. They stopped inside a damp, remote chamber where a spring emerged from the ground. They were far from the white settlers and Christian missionaries who had recently arrived in northeastern Alabama, putting increasing pressure on Native Americans to assimilate to a Euro-American way of life. (In just a few years President Andrew Jackson would sign the Indian Removal Act that would force the Cherokee off their land and onto the Trail of Tears.) Here, in private, the stickball team could perform important rituals—meditating, cleansing and appealing to supernatural forces that might give their team the right magic to win a game of stickball, a contest nicknamed "the little brother of war." This spiritual event, perhaps ordinary for the time but revelatory now, only recently became known because of a set of inscriptions found on the walls of the cave. A group of scholars has now translated the messages, left by the spiritual leader of the stickball team, and describes them in an article published today in the journal Antiquity. Prehistoric ancestors of the Cherokee left figurative paintings inside caves for centuries, but scholars didn't know that Cherokee people also left written records—documents, really—on cave walls. The inscriptions described in the journal article offer a window into life among the Cherokee in the years immediately before they would be forcibly removed from the American southeast. "I never thought I would be looking at documents in caves," says study co-author Julie Reed, a historian of Native American history at Penn State and a citizen of the Cherokee Nation. The inscriptions were written in the Cherokee syllabary, a writing system that was formally adopted by the Cherokee just three years earlier, in 1825. It quickly allowed a majority of the tribe to become literate in their own language, and the Manitou Cave inscriptions are among a few rare examples of historic Cherokee writing recently found on the walls of caves. "Cavers have been going in caves in the Southeast for a really long time, looking for more prehistoric artwork," says Beau Carroll, the lead author of the study and an archaeologist with the tribal historic preservation office of the Eastern Band of Cherokee Indians. "For you to be able to pick out actual syllabary you have to be familiar with it. I think it's all over the place. It's just that nobody's been looking for it." In 2006, a historian and a photographer were documenting the English-language signatures and graffiti in Manitou Cave, which had become a tourist attraction by the late 19th century. They recognized writing that didn't look like English and showed photos to Jan Simek, an archaeologist at the University of Tennessee in Knoxville, who studies rock art in the region. The cave, which is on private land, was sold shortly after the first inscription was photographed, Simek says, and the new owner of the cave would not allow access to anyone. So Simek and his colleagues couldn't document the writings for themselves until the cave changed hands again in 2015.
"Prehistoric people made art inside—sometimes deep inside—many caves in the area, and in some cases, those go back 6,000 years," Simek says. "The writing was important because it suggested some continuity with a tradition that we knew went very far back in the past, so we started to record this stuff. It was a writing system that we couldn't read or write so we asked Cherokee scholars to come help us do it." At the dawn of the American Revolution, the Cherokee homeland covered parts of Tennessee, North Carolina, South Carolina and Georgia. Just after the war, groups of Cherokee who had fought with the British fled contact with the United States and took up residence in Alabama; many took refuge in Willstown, now known as Fort Payne after the U.S. fort that was established there in 1830 as a concentration camp for the Cherokee during Indian Removal. Among Willstown's new residents was Sequoyah, a Cherokee silversmith and scholar, sometimes named as George Guess. Sequoyah thought it would be useful for the Cherokee to have a written language, and he invented a syllabary—easier to learn than an alphabet—made up of symbols for all 85 syllables in the spoken language. After its adoption as the Cherokee Nation's formal writing system, the syllabary went into wide use. The first Native American newspaper, the Cherokee Phoenix, was published in syllabary and English starting in February 1828. "The syllabary is a new innovation in Cherokee society, and it's occurring at the same moment the U.S. government is adamantly pushing its 'civilization' policy—it wants them to Christianize, it wants them to seek English-language education, it wants them to change their gender roles relative to farming so that men are farming and women are relegated to the home," says Reed. The early 19th century was a time of upheaval, especially in Willstown, where the population was growing as more Cherokee people arrived, displaced from their homeland. Among the Cherokee vigorous debates broke out about political and social interaction with whites, and mixed embraces of various "civilization" features. "The great part of Sequoyah's invention is that on one hand it is a trapping of civilization—a written language—and on the other hand it's an affront to the civilization policy because it is the Cherokee language and it enables literacy so quickly that it does the work of revitalizing older pieces of Cherokee tradition," Reed says. As the paper in Antiquity describes, one charcoal translates to "leaders of the stickball team on the 30th day in their month April 1828." A few meters away, another inscription on the wall refers to "we who are those that have blood come out of their nose and mouth," and it is signed by Richard Guess, Sequoyah's son, and one of the first to learn the syllabary. The researchers have interpreted these texts as records of stickball rituals, led by Guess, before the men went out onto the field, and after the game, when they were bruised and bloodied from a touch contest. Stickball was a game similar to lacrosse, with two teams playing on an open field trying to move a ball into the opposition's goal using sticks with nets at the end. It could last for days and was sometimes used to settle disputes between communities, but the sport also had a ceremonial importance to the Cherokee. The players performed rituals before and after the contests that replicated rituals that would need to happen before and after war, and access to sacred sources of water was important during these ceremonies. 
According to Carroll, the archaeologist and co-author, stickball contests were essentially seen as a face-off between two medicine men. "Whoever's magic is the strongest is the one who's going to win the game," says Carroll, who has played stickball himself. Adds Reed: "These games could get extremely violent and sometimes resulted in death for the players. Anytime blood is involved, that substance being outside the body can throw the world out of balance. So ceremonies have to be performed to bring the world [back] into balance." The researchers suspect that this particular team went so far into the darkness of the cave because they sought seclusion from the Christian missionaries who greatly disapproved of stickball and its associated religious activities. (Carroll also says it probably would have been important for the players to be far away from the opposing team.) President Jackson's Indian removal policy became law just a few years after that game, in 1830. Some of the players might have been interned at Fort Payne during this campaign of ethnic cleansing, and by 1839, most Cherokee had been forced off the land and into new "homes" on reservations in present-day Oklahoma. Manitou Cave was opened as a tourist attraction in 1888, but its indigenous history was largely unknown. Modifications to make the passages more tourist-friendly likely destroyed archaeological deposits that might have held clues about the cave's past uses by Native Americans. George Sabo, director of the Arkansas Archeological Survey, who wasn't involved in the study, says the new evidence "anchors important events in early 19th-century Cherokee history onto a specific locale comprising one element of a larger, sacred landscape." A few other syllabary inscriptions have been recorded in Manitou Cave, and in other caves nearby. Not all of the syllabary translations from Manitou Cave have been included in the paper. Carroll says he consulted with fellow community members to decide what texts should and should not be published for a non-Cherokee audience, as some inscriptions contain descriptions of spiritual ceremonies that were not intended for public consumption. Manitou Cave, like many caves that contain Native American rock art in the Southeast, is now on private land. Its current steward bought the cave and surrounding land in 2015 with the intention of preserving the site. The Eastern Band of Cherokee Indians contributed funds for a strong steel gate at the cave entrance to protect the inscriptions. The authors of the study have emphasized the importance of the collaboration between white archaeologists and Cherokee scholars in studying the inscriptions. "We would not have been able to develop as rich and textured a vision of what this archaeological record means without collaborating with our Native American colleagues," says Simek. "Cherokee people are still here, we haven't gone anywhere, we're interested in our history and we can contribute to science and this paper's proof of that," Carroll says. "It doesn't make any sense to me to do all this historical research and archaeology but you don't include the living descendants of the people you're studying." Megan Gannon is a science journalist who often writes about archaeology and space. She was previously a news editor at Live Science and Space.com.
da6e95c5ecb7f571521d4ab0591475bd
https://www.smithsonianmag.com/history/celebrated-korean-war-hero-chew-een-lee-died-88-years-old-180950063/
Korean War Hero Kurt Chew-Een Lee, the First Chinese-American Marine, Dies at 88 Years Old
Korean War Hero Kurt Chew-Een Lee, the First Chinese-American Marine, Dies at 88 Years Old On November 2, 1950, Marine Corps Lieutenant Kurt Chew-Een Lee struck out ahead of his unit in the midst of a blizzard in the mountains of northeast Korea. Lee commanded a machine-gun platoon in the First Marine Division, and they were facing advancing Chinese troops deployed to aid North Korean forces. By drawing enemy fire and yelling phrases in Mandarin, he confused the Chinese units and exposed their positions. His bravery enabled his unit to take a Chinese-occupied base, despite their significantly lower numbers. Lee died last week at the age of 88. For his heroism during the Korean War, he received the Navy Cross, the Silver Star and two Purple Hearts, and before retiring in 1968, he rose to the rank of major. Born in northern California in 1926, Lee became the first Chinese-American Marine in 1946. As the only Asian American in his unit, Lee initially faced his fair share of prejudice and racism from fellow Marines and those under his command. Driven by a sense of patriotism, though, Lee earned their respect. "I wanted to dispel the notion about the Chinese being meek, bland and obsequious," he told the Los Angeles Times in 2010. Around 20,000 Chinese Americans served in World War II, and many served in Korea as well, including two of Lee's brothers. Following World War II, the U.S. Army dropped the designation "Asian American" and abolished segregated units. As a result, the exact number of Chinese Americans who served in Korea is unknown. In 2010, the Smithsonian Channel produced a documentary called "Uncommon Courage: Breakout at Chosin," which focused on one of Lee's most famous exploits. In December of 1950, he led 500 Marines on a rescue mission to save another unit of 8,000 men at the Battle of Chosin Reservoir. For more on Kurt Chew-Een Lee and his heroic story, take a look at these clips. Helen Thompson writes about science and culture for Smithsonian. She's previously written for NPR, National Geographic News, Nature and others.
74f8d6fca6f034b6e5ec1727cb7de7b9
https://www.smithsonianmag.com/history/celebrating-500-years-germans-beer-purity-law-180958878/?no-ist
Celebrating 500 Years of Germany's Beer Purity Law
Celebrating 500 Years of Germany's Beer Purity Law In Germany rules are rules—and they even apply to something as fun as beer. The Reinheitsgebot, Germany's legendary beer purity law, turns 500 on April 23 in a sudsy celebration known as German Beer Day. Many Germans love the law, but others think it's an outdated relic that should be chucked. Beer purity is only one part of the Reinheitsgebot story; protectionism, taxes, national pride and marketing all come into play. How Was the Reinheitsgebot Born? German expat Horst Dornbusch, an award-winning brewer, beer judge and writer, says the rule's 500-year history is rather surprising. "That small 31-word passage in an obscure proclamation, crafted at the height of the Renaissance, has since acquired an almost mythical status in Germany. It is hailed as the country's indispensable guardian of beer quality," he notes via e-mail. The Reinheitsgebot, which may be the world's oldest active consumer protection law, predates the birth of today's unified Germany. Duke Wilhelm IV of Bavaria handed it down in the city of Ingolstadt half a millennium ago as the Substitutionsverbot (substitution prohibition). Even 500 years ago, it was rather old news. Previous efforts to control the quality and pricing of German beer date back to at least 1156, when Augsburg adopted a similar statute under the Holy Roman Emperor Frederick "Barbarossa." Wilhelm's Bavarian law spread gradually across the land until, in 1906, it became mandatory throughout Germany by an Imperial Act under the reign of Kaiser Wilhelm II. But it wasn't until World War I that a Bavarian legislator introduced a new phrase to describe the law during a beer tax debate—the Reinheitsgebot. Though the statute hadn't been known as a "beer purity" law before this time, the phrase was embraced immediately and has been used enthusiastically ever since. Why Was It Passed? Sixteenth-century "breweries" bore little resemblance to modern facilities with their gleaming stainless steel tanks. Brew kettles boiled over open fires, and fermentation took place in pitch-lined wooden vats. The process was less than hygienic, and contamination was common. Brewers regularly added questionable ingredients like wood shavings and roots, and even poisonous (though hallucinogenic) additives like fungi or henbane. The results were intoxicating, but they could also make imbibers sick—or worse. Beer in this era wasn't just for relaxation; it was a dietary staple for many, and the act aimed to ensure a safe, reliable supply. It also helped set prices and profit margins to make sure beer was affordable—in fact, the vast majority of the law deals with pricing and penalties designed to hit brewers in the pocketbook. And the Reinheitsgebot was as much a baker protection act as a beer law. Banning wheat in beer kept it available and cheap for bakers, while brewers relied on the less expensive barley. This edict ensured that the people would have plenty of bread to wash down with their beers. The original law restricted brewers to three ingredients: "thou shalt use no other piece than barley, hops, and water for making beer," it stated. However, it has since been tweaked many times over. Yeast, which converts sugar to alcohol and CO2 during fermentation and helps determine a beer's flavor, wasn't included because the law predates its discovery.
During old-fashioned open-air brewing processes (which some brewers still employ today), there was enough airborne yeast, particularly near bakeries where beer was often brewed, that it was naturally added to the mix unnoticed by brewers. After Antonie van Leeuwenhoek's microscope debuted in 1676, scientists gradually uncovered the role of yeast in fermentation, and the law was amended to include it. Another notable addition was malted wheat, which enables brewers to make familiar German top-fermented styles like Hefeweizen, Alt, and Kölsch. What's a 'Beer' Anyway? After complaints by French brewers, the EU Court of Justice struck down the Reinheitsgebot in 1987 as a protectionist measure. The ruling opened German shelves and taps to foreign brews—no longer could the country use the law to ban imported beers brewed outside of its edicts. Germans did stick to the statute themselves, but a 2005 German court interpretation loosened the reins a bit further. The brewing and sale of non-compliant beers was allowed as long as those products aren't labeled as 'beer.' This has led to a spate of "special beer" and "beer mixed drinks," though actual prosecutions of more egregious violators, in terms of fines and/or jail time, are unheard of. In uber-strict Bavaria, however, authorities are still known to destroy beers like milk stouts under the premise that they are 'misleading' consumers. But the law does have an important practical function with teeth: it's part of the tax code by which the government takes its cut of brewer profits. Honoring Tradition Beer has lost a bit of its luster in the nation where modern brewing methods were born. Consumption has been sliding, and statistics show it's down perhaps one-third since the 1970s, though Germans still rate among the world's most prolific beer drinkers. Among German beer lovers the Reinheitsgebot varies in importance, perhaps according to the Biergarten surveyed, but enjoys strong support. The German Brewers' Association, which represents the nation's mainstream breweries and rigorously defends the Reinheitsgebot, cites a survey conducted by Germany's Forsa Institute that found 85 percent of drinkers favored upholding the law. To traditional brewers the Reinheitsgebot is a guarantee of quality, a defense against an onslaught of cheap rice- and corn-based beers, and not least a powerful marketing tool to promote their product. To those who charge that the law stifles creativity and innovation, the German Brewers' Association counters that monotony isn't a concern. Since the law allows the use of 100 kinds of hops, 40 malts and 200 yeast strains, drinkers can enjoy a different kind of beer brewed under the law every day for 15 years without duplication—even while brewers continue to create more. "So, there is absolutely no incentive for German brewers to let the 500 year-old document fade into the past. Quite the opposite," the association asserts in a statement for the law's anniversary. Time For a Change? Yet drinkers vote with their wallets, and popular beer styles around the world—including both traditional offerings and many in the rapidly growing craft beer craze—often use fruits, cocoa, coffee and any number of other natural ingredients to flavor beers in ways that would be forbidden under the Reinheitsgebot. The trend has reached Germany as well, where many craft brewers are ignoring the Reinheitsgebot. "I just brew what I want right past the law," an unnamed German brewer recently told Dornbusch.
"I call my beers by their style designations and simply leave the word 'beer' off the label." San Diego's Stone Brewing is opening Germany's first U.S.-owned brewery. To mark the Reinheitsgebot's 500th anniversary, Stone is hosting a "Reinheitsverbot" event that will pour only non-compliant beers, of both Stone and German origin, at their Berlin facility. Even in the U.S., 95 percent of Stone's beers do meet the Reinheitsgebot, including all the brewery's year-round offerings. But they don't brew with the law in mind, and certainly aren't afraid to deviate with the use of coffee, fruit, cocoa or other natural ingredients when desired. "It is a fact that high quality beers can be brewed both inside and outside of the Reinheitsgebot and that cheap beers can be brewed both inside and outside of the Reinheitsgebot," Stone Brewing CEO and co-founder Greg Koch told the German press in advance of the event, laughingly adding that he wondered why "someone [would] wish to have their choices limited for them by a 500 year-old taxation law." Horst Dornbusch suggests a simplification of the law could preserve tradition and invite innovation at the same time by simply stating what Germans clearly do not want in their beer. "No rice, no corn, no chemicals, no enzyme preparations, and (because we are in the 21st century) no GMO raw materials," he suggests. "That's 14 words, less than half the length of the 31 words of the ducal decree of 1516! Then leave the rest to the brewer." Brian Handwerk is a freelance writer based in Amherst, New Hampshire.
63ccd866ad7a810c0edb248a837cf702
https://www.smithsonianmag.com/history/chambon-french-town-refugees-180969356/
In the yard of the stone elementary school with the tile roof in Le Chambon-sur-Lignon, a town of just 2,700 people on a high plateau in south-central France, kids play and horse around like school kids everywhere. Except they sometimes chatter in different languages: They're from Congo and Kosovo, Chechnya and Libya, Rwanda and South Sudan. "As soon as there's a war anyplace, we find here some of the ones who got away," says Perrine Barriol, an effusive, bespectacled Frenchwoman who volunteers with a refugee aid organization. "For us in Chambon, there's a richness in that." At more than 3,200 feet in elevation, the "Montagne," as this part of the Haute-Loire region is called, first became a refuge in the 16th century, when residents who converted to Protestantism had to escape Catholic persecution. In 1902, a railroad connected the isolated area to industrial cities on the plain. Soon Protestants from Lyon journeyed there to drink in the word of the Lord, and families afflicted by the coal mines of Saint-Étienne went to breathe the clean mountain air. Thus Chambon-sur-Lignon, linked to Protestant aid networks in the United States and Switzerland, was ready for the victims of fascism. First came refugees from the Spanish Civil War, then the Jews, especially children, in World War II. When the Nazis took over in 1942, the practice of taking in refugees—legal before then—went underground. Residents also helped refugees escape to (neutral) Switzerland. In all, people in and around Chambon saved the lives of some 3,200 Jews. Local archives have not yielded one instance of neighbor denouncing neighbor—a solidarity known as le miracle de silence. In 1990, the State of Israel designated the plateau communities as "Righteous Among the Nations" for their role during the Holocaust, a supreme honor usually bestowed on an individual and given to just one other collectivity, a town in the Netherlands. The tradition of opening their homes to displaced people continues today. In the village of Le Mazet-Saint-Voy, Marianne Mermet-Bouvier looks after Ahmed, his wife, Ibtesam, and their two small boys, Mohamed-Noor, 5, and Abdurahman, 3. The family arrived here last winter and, for now, lives in a small apartment owned by Mermet-Bouvier. They lost two other children during the bombing of Aleppo, and then spent three years in a Turkish camp. That's where the French government's Office Français de Protection des Réfugiés et Apatrides found the family. But even with entry papers, somebody in France had to put them up. Their sponsors, not surprisingly, were here on the plateau. Ahmed and Ibtesam, who is now six months pregnant, smile often, and the word that keeps coming up in Ahmed's choppy French is "normal." Despite the upheavals of culture and climate, Ahmed finds nothing strange about being here, which, after the hostility he and his children encountered in the Turkish camps, was a thrilling surprise. "Everybody here says bonjour to you," Ahmed marvels. Hannah Arendt coined the phrase "the banality of evil" to explain how easily ordinary people can slip into monstrosity. The Bulgarian-French philosopher Tzvetan Todorov advanced its lesser-known opposite: the banality of goodness, which is what you run into a lot around here. The locals are sometimes known as les taiseux—the taciturn ones—because they hate making a fuss about their kindness to needy outsiders.
Still, their generosity is extraordinary at this moment in history, when much of the world (including parts of France) is in a fever about immigrants and refugees, erecting walls and laws and political parties to keep “others” out. Hervé Routier sits on Chambon’s municipal council and also teaches French to young immigrant men, using the driving-test manual as his text. “It’s not a decision we reflect on, it’s always been spontaneous,” Routier said of giving assistance. “We just keep doing what we’re doing.” Margaret Paxson, an anthropologist who lives in Washington, D.C., learned recently that she has family ties to Chambon and is writing a book about the region. “This story is about now,” says Paxson. “Not because we need to turn the people who live here into angels, but because we need to learn from them.” Next to the old elementary school stands a modern structure: the Lieu de Mémoire, or Place of Memory. The little museum, opened in 2013, is dedicated to the role of Chambon and nearby villages in sheltering refugees, Jewish children in particular. Its holdings include photographs, archives and videotaped first-person accounts from villagers and individuals who were rescued. Gérard Bollon, a historian and resident, takes pride in the view from the museum’s second floor, which looks out on the schoolyard. “You see our little kids rushing toward the kids who have arrived from elsewhere, kids who don’t speak a word of French, and take them by the hand. There it is! We’ve succeeded. That’s our lineage.” Photography for this piece was facilitated by a grant from the Pulitzer Center on Crisis Reporting. This article is a selection from the July/August issue of Smithsonian magazine
9dc20ce2fc2071b2ba84006ffeb632dc
https://www.smithsonianmag.com/history/charlatan-ozarks-still-looms-over-haunted-crescent-hotel-180973743/
They found the bottles buried on the edge of the woods behind the hotel last spring. Specimen bottles mostly, each about the size of a jam jar, give or take, many intact, some still filled with clear liquid yellowing to brown then molasses black, the wax and rubber seals of the glass stoppers cracked but somehow holding after 80 years in the ground; others pristine, as if they'd just been shipped from the lab. In the first hour or so the groundskeeper unearthed scores of them, but even in the bright light of morning it was impossible to judge which body parts they held. By the end of that day the staff had pulled hundreds from the tree line down the hill and everyone in Eureka Springs knew what they'd found. The next day everyone in Arkansas knew and by the end of the week everyone in America knew they'd found bottles and jars and "body parts" behind the Crescent Hotel, and that the sheriff and the coroner had come and gone, and the archaeology team from the university in Fayetteville was on its way. * * * Built in 1886, the Crescent Hotel in Eureka Springs, Arkansas, is the grandest resort in the Ozarks. By the time of the Great Depression, it sits empty. In 1937 it becomes the Baker Hospital. Norman Baker claims to have a cure for cancer. The colorful mailers and brochures he sends out refer to the area as the "Switzerland of America," under the cheery banner, "Where Sick Folks Get Well," and promise that cancer is curable without the knife or radium or X-ray. But Norman Baker is a quack. That's why the medical board ran him out of Iowa, then chased him down to Mexico. Norman Baker is not a doctor. Norman Baker is a charlatan. He is conventionally handsome. Sharp-featured and clear-eyed. Confidence-inspiring, straight out of a B-movie casting call, he has a well-cut head of distinguished gray hair and a strong jaw balanced on a celluloid shirt collar. Is the gaze level? The handshake firm? Are the palms cool and dry? You bet. His patients—the suckers and the gulls and the true believers—are weak with sickness, desperate, pale. They come from everywhere, and pay cash for the treatments. They pay and pay. Baker promises life, he promises vigor. No surgery. No radiation. Just Formula 5 and the power of positive thinking. Baker drives a hand-hammered, coach-built Cord automobile custom-painted in electric lavender. He wears dark chalk-stripe three-piece suits in winter, white suits and matching shoes in summer. Lilac shirts year-round. He sports a diamond horseshoe stickpin and a watch fob heavy as an anchor chain. In the leanest years of our history, he doesn't hide the money he grifts from the sick and the hopeless, he broadcasts it. He brags it. He is loud with it. He is the picture of vulgar American excess. * * * "Formula 5" is alcohol, glycerol, carbolic acid, ground watermelon seed, corn silk and clover leaves. It is administered by injection at the site of the cancer—up to seven times a day. Baker stole the recipe from another con man. It does nothing. * * * Illness is an island. There is no loneliness like sickness. You feel it in these rooms. Not isolation, but desolation. * * * The Crescent, restored, is still open and more popular than ever. In part because it bills itself as America's Most Haunted Hotel.
On the ghost tour they tell you these stories: Michael, the stonemason flirt from Ireland falling to his death during the hotel's construction, now and forever romancing guests in Room 218; Breckie, the 4-year-old boy taken up by illness a hundred years ago, lately playing in the hall and photobombing tourist selfies and snapshots; Theodora, 80 years dead and still trying to find her key in front of Room 419, an obsessive-compulsive packer of guest suitcases when angry; the Nurse, pushing a gurney through eternity on the third floor; and perhaps most famously, the Girl in the Mist, who from time to time—always around 10:30 in the moonlight—will fall or fling herself from one of the east side balconies into the garden below. She is shrouded in mist as she falls, as is her story. There's no record of an event like this at the hotel, but folks claim to have seen the ghost, a white radiance plummeting from darkness to darkness, and they wonder...was she pushed? Did she jump? Was she—is she—a character straight out of Dreiser, pregnant and unwed and ruined, killed by despair and Victorian convention? Here the implausible yields to history, as the hotel was, for a few years anyhow, a distinguished college and conservatory for young women. From 1908 until 1924, the hotel bolstered its summer tourist revenue by becoming a destination boarding school for the edification of American womanhood. According to the Encyclopedia of Arkansas, an early advertisement in Cosmopolitan magazine reads as follows: "Crescent College and Conservatory for Women. On top of the Ozarks. Famous for healthfulness and beauty of location. $300,000 fireproof building. Rooms with private bath. Elevator. Accredited Junior College. General courses; art, music, expression, domestic science." But there's no evidence that any fallen or falling woman was a student. * * * On the fourth floor, what used to be the faculty lounge is now a small museum and library where the history of this place is made vivid. Historian and artist and storyteller Rebecca Becker keeps the memory of the conservatory days alive. There are exhibits and display cases and even a small stage. And Keith Scales is in charge of the ghost tours that begin here, and the ghost stories, and they work together to make the hotel the lively old dame she is. The Ozarks are a crossroads, and over the last 10,000 years or so, Eureka Springs has attracted pilgrims and seekers of every kind. It was said the waters healed the sick and made you whole again, so the Osage chief who carved out the holy basin next to the spot where the Basin Park Hotel now stands on Spring Street would recognize the impulse if not the architecture. This has always been a boom-and-bust town and a hide-out and a rest cure, a place to lay low or live high. Bank robbers, bootleggers, railroad magnates and cathouse madams all made their fortunes here. Gangsters from Chicago kept the hotels and roadhouses and nightclubs busy after World War II, and when the local economy swooned in the 1970s, hippies saved the place and its storied gingerbread homes and it became an arts oasis. It remains so. Over the years, Scales has performed a one-man Norman Baker show many times on that small upstairs stage. He and Becker do their best to make the Crescent a history project rather than merely a haunted house. In addition to the ghost tours and the haunted hotels and the springs and the baths, Eureka Springs has galleries and restaurants and bars and a beautiful old Carnegie library.
There is a Passion play here as well, and an opera company, and the Christ of the Ozarks statue, seven stories high. * * * The eroded Ozarks Plateau upon which all this rests is hundreds of millions of years old. Walk ten feet into the trees and you’re part of prehistory. The woods resonate with that ancient animism. The forests hum with it. You can hear them breathe. There’s something heavy in the quiet here, though, some weight to the air. Especially at sundown, or right before dawn when the mist hangs in the cuts and hollows. There’s something eerie in it, dreamlike, something deep and uneasy. In the hotel, it’s the same. Every stairwell and landing is fraught, every empty hallway feels crowded, and every room you enter feels as if someone just left. You are alone but never alone here. And to the suggestible, to the willing, that’s the point of the place. To be a little frightened by the supernatural. To feel alive in the company of the dead. In the dark, everything is a ghost story. But even the skeptical are moved here—if not to fear then to sadness. The sharp historical sense of dread and of pain and of loss can be overwhelming if you open yourself to it. Because worse than any monster is a man like you or me. Weak. Greedy. Unaccountable. Selling hope to the hopeless. If you checked in to the Baker Hospital, your room holds everything you ever lost. * * * The morgue smells of dust. It smells of dampness and dust and stone. It holds the heat of the day in the front room, but the farther in you walk, the cooler it gets. The morgue is not in the basement, but it is dark and low-ceilinged and you imagine the weight of the hotel on top of you. The refrigerator where they kept the bodies is the last stop on the ghost tour. In an alcove are shelves with more of the specimen and sample jars of the kind that were found down the hill. Many of these hold indeterminate bits of flesh and were displayed in the lobby of the hotel when it was the cancer “hospital.” There was a “ghost sighting” in this room—right where you’re standing—on a recent ghost-hunters’ TV show. It plays on a loop behind you. There’s a commercial kitchen utility counter and sink against one wall, which some people mistake for an autopsy table. The morgue was probably a pantry, but for the few years of Norman Baker’s Cancer is Curable Hospital, the dead were brought down here to wait for the hearse from the local funeral home. On the day Germany invaded Poland, September 1, 1939, Norman Baker was arrested for mail fraud. That’s how they got him: the brochures. He was tried and convicted and sent to Leavenworth. Norman Baker was a charlatan in the long terrible tradition of American charlatans. A liar and a thief who sold broken promises. Norman Baker died a Florida retiree in 1958. He had cirrhosis. And cancer. * * * “I think they’re pig parts,” Scales says of the contents of the specimen jars. Not surgical excisions. Not even human bits cribbed from the local mortician. Pickled trims and ends bought from the local butcher, maybe; or specimens ordered from a medical school supply house. We’ll get the results once the university tests them. But the jars will make no sense even then, even as an advertising device. Baker claimed to cure cancer without surgery. Why bother displaying fake tumors? Because people desperate to be persuaded don’t think. They want to be convinced. They want to be sold. They want to believe. Because they ache to live. And maybe that’s what you get from coming here, from every one of these ghost stories. 
A lesson in longing. Life persists. Life insists on life. So maybe in the middle of that first long night you wake. You fumble for the light and walk to the window. Behind you the old hotel creaks and groans. Outside, the invisible valley is all silence and darkness. What you see on the glass is your own reflection. The only ghost in this room is you.
5fa0131bcf54a09acfd8a50503b5474c
https://www.smithsonianmag.com/history/charming-story-homer-cummings-harold-israel-180961429/
The Suspect, the Prosecutor, and the Unlikely Bond They Forged
The Suspect, the Prosecutor, and the Unlikely Bond They Forged As attorney general of the United States in the 1930s, Homer Cummings announced the capture of Bruno Hauptmann in the kidnapping and murder of the Lindbergh baby. He built Alcatraz, the island prison. In the time of John Dillinger, Pretty Boy Floyd and Bonnie and Clyde, he consolidated federal investigative units into what became the FBI. He fought incessant battles for New Deal legislation. And he was instrumental in one of the century's great scandals, President Franklin D. Roosevelt's disastrous attempt to pack the Supreme Court. In fact, Cummings was chief architect of the plan, which was widely condemned; its true purpose of manufacturing a friendlier Supreme Court was buried under talk of judicial efficiency. Within the legal profession, though, Cummings is remembered for what he did as a county prosecutor in the 1920s. His performance in a Connecticut murder case so moved Felix Frankfurter that the renowned Supreme Court justice declared it "will live in the annals as a standard by which other prosecutors will be judged." And so it has, providing a historical counterpoint to the present day, when stories abound of prosecutors who have lost their way, who do anything to win a conviction, who place politics above principle. But what happened in that case is only Act I in an extraordinary drama. Act II, untold until now, has been sitting in an archive at the University of Virginia for more than 40 years, tucked among 171 linear feet of Cummings' shelved papers. Connecticut, in the 1920s, did not hang the condemned by having him fall. The executioner had the condemned stand on an iron plate, noose around his neck, to be yanked skyward courtesy of a contraption called the "upright jerker." Why Connecticut eschewed a simple act of gravity for a system of weights and pulleys is not entirely clear (the patent obtained by an earlier warden might explain it), but the upright jerker loomed over the proceedings when, on May 27, 1924, Homer Cummings entered a Fairfield County courtroom and argued before the Honorable Waldo Marvin. If your Honor please: I would like to call to your attention this morning the case of State versus Harold Israel. Inside a courthouse that looked like a castle, Cummings spoke for the next hour and a half. He was a polished orator. He had graduated from Yale, and had debated against Harvard. On the 15th day of February, 1924, this accused was bound over to this court by the City Court of Bridgeport on a charge of murder... Cummings had been the Fairfield state's attorney—the chief prosecutor, appointed by judges—for the past ten years. Now 54, he had nursed political aspirations since he was in his 20s. Three times he had run for Congress or the U.S. Senate and lost. Four years before, as chairman of the Democratic National Committee, he had delivered the keynote at the party's national convention in San Francisco. The Chicago Tribune described him as "tall, rotund, but not grossly rotund" and "bald but not grotesquely bald." His blue eyes were now framed by gold pince-nez glasses clipped to his long nose. ... in the first degree, growing out of the death of Reverend Hubert Dahme, pastor of St. Joseph's.... The victim was a Catholic priest in a city full of Catholics. Father Dahme, a 56-year-old German native, had built a convent and a school in Bridgeport. The Easter before, he had laid the cornerstone of a $100,000 church.
Twelve thousand mourners packed the funeral. He had been shot while taking a stroll downtown on February 4. At Main and High, amid a stretch of grand theaters, a man had approached Dahme from behind and fired one bullet from a .32-caliber revolver into his head at 7:45 p.m., as streetlights burned and theatergoers scurried about. Minutes before, Ethel Barrymore had passed by on her way to the New Lyric, where she was performing in The Laughing Lady. On account of the tragic nature of this murder, the well-frequented spot where it occurred and the prominence of the victim, an unusual amount of public interest was aroused.... This was, in cop parlance, a heater case, the pressure to solve it great. “The most shocking crime of its kind in the history of Bridgeport,” the mayor called it. Rewards totaling some $2,500 were offered for the killer’s capture. Still, days passed without an arrest. A week after the shooting, a police officer in Norwalk, west of Bridgeport, saw a young man he thought to be acting strangely. It was after 1 a.m. The man identified himself as Harold Israel. He said he had neither money nor a place to sleep and was making for home, in Pennsylvania. Police found in his possession a .32-caliber revolver. Israel was arrested for having a concealed weapon and hustled to city court, where he was fined $50 and sentenced to jail. When a Bridgeport Times reporter learned of the arrest (.32-caliber? Hmm), the newspaper contacted a Bridgeport police captain, who dispatched two detectives to interview Israel and two other detectives to examine the gun, which, they discovered, had four chambers loaded—and one empty. Israel told police that on the night of the murder, he was at the movies, alone. The police considered him an “arch liar,” the Bridgeport Times reported. Israel’s former landlady called him “a rather queer sort of fellow.” The paper instructed readers: “If you have any information or are of the opinion that you saw a suspicious character that may be Harold Israel,” contact police. “You may help to solve the most brutal murder in the history of Bridgeport.” As Israel was being questioned, eyewitnesses came in and implicated him. The interrogation continued until, after 28 hours, he confessed. Three weeks after Father Dahme’s death, the coroner summarized the evidence against Israel: A waitress said she had seen him walk past her restaurant minutes before the shooting took place a block away. Four witnesses said they had seen him after the shooting, fleeing the scene. A ballistics expert said the bullet recovered from Father Dahme’s head had been fired from the revolver found on Israel. And then there was Israel’s confession, oral and written. The case against the accused seemed overwhelming. Upon its face, at least, it seemed like a well-nigh perfect case.... The evidence had been described by those who believed in the guilt of the accused as “100 percent perfect.” In 1924, the criminal justice system’s flaws were not understood in the way they are today, now that DNA has exposed so many wrongful convictions. Little had been written about false confessions, mistaken eyewitnesses or bogus forensics. The year before Israel’s arrest, Learned Hand, an esteemed federal judge in New York, dismissed the very idea that an innocent person could be convicted, calling it “an unreal dream.” Cummings took close to half an hour to describe the evidence pointing to Israel’s guilt. Then, unexpectedly, he said: Despite these facts, however... 
When the lives of Harold Israel and Homer Cummings intersected in 1924, the two men came from different generations and worlds. Israel, Cummings' junior by more than 30 years, was born in 1903 in Mount Carmel, Pennsylvania, a small town in coal country. He was the youngest of at least five kids, according to 1910 census records. His father, John, was a miner. Harold's mother, Wilhelmina, called Minnie, was born in Germany. She died at 39, when Harold was 5. "Exhaustion," her death certificate said. Later, Harold would be unable to summon her maiden name. One descendant says: "Harold was literate. Granted, I don't think he graduated from high school. My dad thinks he got kicked out of the house and kind of sold off to another family to help them. They were poor as well." Harold grew up to be thin and quiet. When arrested in Connecticut, Israel filled in some of his story, saying he had served in the Army, stationed in Panama. After his discharge he had made his way to Bridgeport to join a friend from the military. Israel had about $300 when he arrived. When the money was spent, he struck out for home. Homer Cummings, an only son, was born in 1870, to a life of advantage. His mother, Audie, was a Knickerbocker, descendant of a well-known line of Dutch settlers in New York. His father, Uriah, was a successful inventor, historian and specialist on the American Indian. He owned a cement mill in Akron, New York, capable of producing 400 barrels a day. His family had come to Massachusetts from Scotland in 1627. Homer grew up in Buffalo playing baseball, tennis and lacrosse, his mother "talented and beautiful," his father "one of the kindest men in the world," he told the Buffalo Evening News. His neighbor was an eminent architect, his friends were future physicians and lawyers. After graduating from Yale in 1891, Homer remained to study law, graduating again in 1893. Four years later, when he was 27, he married Helen Woodruff Smith, daughter of a New York banker. The couple wed aboard the banker's 108-foot yacht, a mile out on Long Island Sound, with orchestra on deck and pleasure vessels all around, cannons booming, the bride's veil pinned with a diamond star, the wedding ring a constellation of diamonds, emeralds, rubies and sapphires. Homer's could have been a gilded life. But in his 20s he switched from Republican to Democrat. In Connecticut, "Democrats were so scarce that one who could sign his name, made a habit of blowing his nose and had not murdered his mother automatically became a party leader," read a political report from that time. Cummings, having not murdered his mother, became a party leader. He failed to win national or state office, but did win three terms as mayor of Stamford, a Republican stronghold. He was 30 when first elected. As mayor, Cummings was a progressive, pursuing safety regulations, investigating slaughterhouses, breaking the local utility's monopoly. In 1905 he allowed several Italian societies to hold a Sunday picnic—beer, fireworks and all—in a prosperous part of town. "When it became known that Mayor Cummings had licensed a Sunday picnic the Puritanical element of Stamford was horrified," the New York Times reported. After ten years of marriage and one child, a son, Homer and Helen divorced.
When the couple had wed, the Times described him as “one of the most brilliant young lawyers and politicians in the state of Connecticut.” Now, the paper said he had been a “struggling young lawyer” when the two joined fortunes, and that “his rise, especially in politics, is regarded as due largely to his wife’s efforts.” Then the publicity got worse. A young man sued Helen for breach of promise of marriage, saying that when she was married—and he was 18—they had begun an affair. Love letters were passed. A newspaper published dozens of them. But in 1911, when the case was tried, the jurors found for Helen, unable to discern in her many letters any promise of marriage. When Homer stood in that Connecticut courtroom in 1924 in the Harold Israel case, he was 13 years removed from those mortifying newspaper stories—and remarried, to the heiress of a silk fortune. Despite these facts, however... Some people had doubts about Israel’s guilt, Cummings told the court. So Cummings had elected to investigate on his own. He interviewed every witness. He stood where they stood when they saw what they’d seen. He interviewed Israel, in the presence of Israel’s public defender. He studied the police reports, consulted experts and walked the crime scene. It goes without saying that it is just as important for a state’s attorney to use the great powers of his office to protect the innocent as it is to convict the guilty. Cummings told the court what he had learned: At the waitress’s restaurant, there was a glass partition inside the front window. The two panes were separated by a couple of feet, with a light between. These double windows created distortion, making it “very difficult” to make out the features of any person on the other side. He also noted that when he had interviewed the waitress, “she was by no means certain of her ground.” The prosecutor also found reason to doubt the four witnesses who reported seeing Israel fleeing. One said the shooter had used a black pistol that did not shine. Israel’s revolver was nickel-plated, Cummings told the judge. Under electric lights, it likely would have glinted. Cummings had recreated the conditions—the distance, the lighting—reported by two other witnesses, and said he could not even identify a person he knew well, much less a stranger. The fourth witness’s account suffered from a “tinge of the imaginative” and changed on second telling. Instead of relying on the single ballistics expert used by police, Cummings asked six others to compare the mortal bullet with Israel’s gun. These experts had studied at Harvard, Yale, MIT and had worked for Winchester, Remington, the New York City Police Department. All six concluded that Israel’s gun had not fired that bullet. Cummings had asked three physicians to examine Israel two days after his confession. They found him to be a docile man, particularly vulnerable to suggestion, physically and mentally spent, incapable of saying anything dependable. Later, his condition restored, he reasserted his innocence, saying he’d confessed just to get a rest. All three physicians believed his confession held no value. As for Israel’s alibi, the theater he claimed to be in had been showing four short movies on a loop. Israel had described what was showing at 7, when he entered, and at 9, when he left—and the theater’s manager had confirmed his account. “I do not think that any doubt of Israel’s innocence can remain in the mind of a candid person,” Cummings told the judge. 
The state's attorney said he wished to enter a nolle prosequi—a Latin term used to mean, "We shall no longer prosecute." He wanted to drop the murder charge. Judge Marvin commended Cummings for his "painstaking care" and granted his request. In years to come, writers would describe Israel's reaction in court to the judge's order. One described his "quivering lips," another his bursting "into tears," yet another his "hysterical joy." Alas, these writers suffered their own tinge of the imaginative—for Israel was not in court that day. He did not see his life being saved by a prosecutor who blew up the police's case, Bridgeport's daily newspapers reported. Israel was apprised later in jail, where he was finishing his time for carrying a concealed weapon. Upon hearing the news he said simply, "That's good. It came out right," the Bridgeport Times reported, adding: "Israel is going back to Pottsville, Pa. He will not be found again, he says, carrying concealed weapons and he is going to try to be a hard working boy living at home among his friends and neighbors." Within a few days, friends rounded up the money to pay Israel's outstanding court costs. He was then taken to the train station and sent home. A prosecutor who bucked the police and backed a suspect—a man passing through, a man without means—seemed to invite recrimination. But immediately after the hearing, the police superintendent said his department accepted Cummings' conclusion "without question." The local press lauded Cummings' "brilliant presentation" and "masterly analysis." A law journal published his entire statement. In time, it became required reading for lawyers in the U.S. Department of Justice. After he stepped down as prosecutor later that year, the Fairfield County Bar held a banquet in his honor. Nine years after Learned Hand mocked wrongful conviction as "an unreal dream," Yale law professor Edwin Borchard published Convicting the Innocent, a book documenting 65 such cases. The Israel case was not among the 65 because he was never convicted, but Borchard cited it in his introduction to note the danger of false confessions. Cummings, in his mid-50s, settled into private practice, focusing on corporate law at Cummings & Lockwood, a firm he had formed with a friend. In 1932, he attended the Democratic National Convention as a delegate and delivered a stirring seconding speech for Franklin Delano Roosevelt, who, once elected president, named him attorney general. Cummings held the post for nearly six years. The historian Arthur M. Schlesinger called Cummings "a man of genuine ability, wily in the law, experienced in politics, courageous and tough." Other historians questioned Cummings' legal acumen while noting his "ferocious appetite for bureaucratic power" and accusing him of turning Justice into a "patronage reservoir." Although he expanded the Justice Department's reach, he faced criticism for not expanding it enough. When he refused to apply a federal kidnapping law to lynchings, Walter White, head of the NAACP, wrote Cummings: My dear Mr. Attorney General: We have read with interest the Associated Press dispatch of December 21 that you ordered the Bureau of Investigation of the Department of Justice to find a cloak which Mrs. Campbell Pritchett lost at a party given by you and Mrs. Cummings. Has the Bureau found Mrs. Pritchett's cloak yet?
If so, may we inquire if it would be possible for you to assign the operatives thus freed by completion of that job to investigate the interstate kidnapping and subsequent lynching of Claude Neal. His personal life continued to make news. In the late 1920s, his second marriage ended in a Mexican divorce. His third marriage, to Cecilia Waterbury, was, forgive the cliché, charmed. In 1931, Homer and Cecilia spent two months touring the Mediterranean. Homer wrote a travel memoir, The Tired Sea, describing how the couple picnicked in Beirut, survived high seas in Malta, and in Jerusalem dined with Gene Tunney, the famous boxer and one of Cummings' closest friends. In Washington, Cecilia's "quick wit and keen intelligence" cut "a bright path across the capital's social scene," the New York Times reported. At the beginning of 1939, Cummings stepped down as head of the Justice Department. Seven months later, Cecilia died, leaving Homer, 69, alone. On July 26, 1946, a Friday, at a little before 5 in the morning, Harold Israel got on a train in Pottsville, Pennsylvania. He rode to Philadelphia, got off and hopped another train to Washington. At about 11 a.m. he arrived in the capital, then made his way to 1616 K Street Northwest, a few blocks from the White House. There, he prepared to see Homer Cummings for the first time in 22 years. Israel was now 43. He did not know what this was about. All he knew was that a special agent with the FBI had contacted him to say that Cummings wished to meet. Cummings was now 76. After leaving the Justice Department he had returned to private practice, working in Washington, where he owned an English Tudor with a library and a butler's pantry. By now he was remarried, to Julia Alter, a newspaper writer. Since Connecticut, contact between Cummings and Israel had been fleeting. In 1941, they had exchanged brief letters. "Dear Friend," Israel wrote first. "Just a few lines to let you know that I am well and that this will leave you the same. I guess you think that I have the nerve to write to you for what you have did for me. But you see I have to write to some one." Israel's letter said he was out of work and "on the relief." He had little to live on and wasn't sure what to do. Cummings replied a week later, saying how pleased he was to hear from Israel. But, his letter said, "I do not know at present what I can do." Five years later, in the spring of 1946, Cummings saw an opportunity to help. He received a telephone call from a filmmaker, Louis de Rochemont, who said he was considering producing a movie about the Israel case for 20th Century Fox. The filmmaker asked: Do you know where Israel can now be found? When Cummings was attorney general, his FBI director was J. Edgar Hoover. Hoover was still there (and would be for decades to come), so Cummings reached out, asking for information on the man he had once saved. On May 27, 1946, Hoover wrote to share what his agents had learned. Israel was living in Gilberton, another Pennsylvania coal town. He was working for the Philadelphia and Reading Coal and Iron Company, where he was "well accepted and highly regarded." He had two boys, ages 19 and 13. The older one was serving in the Navy. Cummings wrote back, pressing for details. Was Israel's wife alive? How much was he paid? What was his house worth? Hoover answered: Israel's wife of 20 years, Olive Mae, was alive and living with him. He worked seven days a week, for $60 per week. His house, an "extremely modest" duplex on an unpaved street, was worth about $700.
He oiled machinery (“a good, dependable employee”); was a “constant attendant at the Gilberton Methodist Church of which he is a Trustee”; and was a “family man...not known to ever frequent the tap rooms.” Thus informed, Cummings began negotiating with 20th Century Fox. With the help of his firm, he researched trusts, bonds and taxes. When Israel showed up in his office on July 26, Cummings shared the results of his labor. He told Israel that the film company was paying him $18,000 for the rights to his story. Cummings had set aside $6,500 for Israel’s anticipated tax hit. He had invested $8,995 in U.S. savings bonds in Israel’s name. The rest, $2,505, was made out in a check, which Homer handed to Harold. In today’s dollars, that $18,000 would be worth about $222,000. Cummings also negotiated a deal for his own life rights, securing $10,000—which he donated to the George Washington University Hospital. Israel, check in hand, left for Gilberton. A few days later, Cummings received a letter from Olive Israel, describing what happened when Harold’s train pulled in. “When we met him at the station and he got in the car I said ‘Are you O.K. and what did Mr. Cummings want you for.’ He had a big smile and said, ‘I’m all right and we have plenty of money.’ I said ‘how much.’ When he told me I almost fainted. I couldn’t believe him so he told them to stop the car and he showed me the check first, then the paper with all the bonds.... “We can’t begin to thank you enough for what you have done for us,” Olive wrote. Harold, she said, “has been a perfect husband & father....He has worked hard and was always willing to do anything.” Now he could afford to fix up their house. Now he could do something about their 13-year-old Ford. Even before this, Olive wrote, Harold had told her how much Cummings meant to him. “To him Mr. Cummings you are next to God. He worships you. He said he would trust you more than anybody in this world.” In Washington, Israel had told Cummings he’d been hurt by his depiction in a recent Reader’s Digest story reconstructing the Connecticut case. The story had referred to him as a “penniless tramp” and a “vagabond.” Cummings knew the story’s author, Fulton Oursler. (He later wrote The Greatest Story Ever Told, the best-selling biography of Jesus.) Oursler was involved in this movie. So Cummings wrote to him, saying Israel was a respectable, hard-working family man with a “distinct sense of pride and self-respect.” Israel was never a tramp, Cummings wrote, and he was “leaning heavily” on Oursler to ensure the movie did not cast him so. In August, Olive wrote Cummings to say Harold had bought a 1940 Chevrolet for $800 and planned to build a bathroom in their house. They had ordered a refrigerator because food didn’t keep long in their icebox. They also hoped to get a porcelain sink for the kitchen, dental care for Harold and Olive and some new clothes. “Mr. Cummings I don’t think it is extravagant to try to buy these things that we wanted all our life and could never get until you made it possible, do you?” she wrote. Cummings wrote back a few days later, saying the purchases seemed “entirely justified. I hope that you and your family will derive great comfort and happiness from these expenditures....When I last saw Harold he spoke of the need of dental work. 
This, I think, is very important, as health in large measure depends upon well-looked-after teeth." Olive replied to this letter, and Homer replied to hers, and Olive returned that one, and over months, then years, a few letters became dozens. The correspondence grew less formal, the families sharing ailments (Homer, a blood clot in his left arm; Harold, a bad cold) and talk of weather ("it is supposed to snow 5 inches today"). Olive provided updates on the couple's two sons, on Freddie getting married ("I'd rather if he waited until he is older but...I guess if they love each other that's all that matters") and having a daughter ("I don't think we could have picked a prettier baby if we tried to pick from a million babies"), and then another daughter, and on Bobby making JV basketball, then varsity football and baseball, then joining the Army and serving in Germany, then France. Harold and Olive sent cards, and Homer sent gifts: ties for the men; perfume for Olive; a sweater outfit for Freddie's first daughter; a coverlet crocheted by Homer's wife for Freddie's second daughter. Olive and Harold would ask for advice—on legal or financial matters, or about their sons' career prospects—and Homer would oblige. In the letters, the difference in their circumstances is not remarked upon. Homer mentioned vacations in Florida and golfing in North Carolina. Olive described Harold's routine of waking at 6, working until 2, coming home to pick coal or maybe work on his car, then, at night, listening to the radio and eating potato chips. On some Monday nights, they went to the midget auto races. In early 1947, the movie was released. Boomerang! was directed by Elia Kazan, later of On the Waterfront fame. It made the prosecutor a young man—less established, more vulnerable to pressure—and introduced corrupt political forces for added drama. But it remained largely faithful to fact and depicted the accused sympathetically. In the movie, as in real life, the priest's murder was never solved. (In Bridgeport, some police continued to believe Israel was guilty.) Homer called it "rather stirring" and "essentially sound." In time, the friendship moved beyond letters. Harold and Olive invited the Cummingses to visit—and in the summer of 1947, Homer and Julia set out for Pennsylvania. The story of how Olive prepared for their arrival—of how determined she was to make a good impression—would be told in the Israel family for decades. Harold and Olive had a small mutt that was getting up in years. Olive worried that its coat was too gray. So she kicked everyone out of the house and dyed the dog's fur. In 1952, Olive wrote that work for men in Gilberton had become slack, with coal operations shutting down. She tried working in a factory, sewing cuffs on shirts, but couldn't take the smell of oil and "everybody grabbing, hurrying." When Homer asked about the bonds, Olive said they had spent all of the money before the bonds matured. Homer told her he understood: The family had been under a lot of pressure. In Christmases to come, he would send especially nice gifts and, on occasion, money. In 1955, Julia Cummings, Homer's fourth wife, died. She was 49. Her obituary said she had suffered from high blood pressure. His son had died two years before. In July 1956, Homer exchanged letters with Harold's family one last time. In September he died at home, at the age of 86. His house in Washington, the English Tudor, was sold the following January. The buyer was the country's new vice president, Richard Nixon.
In Connecticut, Homer Cummings’ name is still attached to Cummings & Lockwood, which now has 70 attorneys in six offices. A park in Stamford, on Long Island Sound, is named for him. People go there to play tennis or picnic or watch Fourth of July fireworks. Cummings’ name is also on an award, given to a Connecticut prosecutor who exemplifies his principles. Kevin Kane, the chief state’s attorney of Connecticut, says the award helps “make sure that we don’t forget what our role is”—to do justice and to represent all the people. In 2008, Kane became convinced that two men arrested in the murder of a well-known energy scientist were innocent; he went into court and moved to have the charges dismissed. “And I did think during that, ‘What would Homer Cummings have done with a case like that?’” Harold died in 1964, at the age of 60. It was winter, with snowdrifts piled up to car windows, but the coal company pulled out its heavy equipment and plowed roads all the way up to the house so mourners could visit and view Harold’s body in the parlor. “They had a good crowd there,” says Harold’s granddaughter Darlene Freil. Harold and Olive had six grandkids and 13 great-grandkids, in a family tree that continues to grow. Darlene remembers that, quiet as Harold was about all he’d been through, Olive never tired of talking about Homer Cummings. Olive had a keen sense of legacy. She often told her family: If things had gone differently in Connecticut, none of you would be here. This story is published in collaboration with the Marshall Project and includes reporting by Lisa Mullins and Lynn Jolicoeur of WBUR-FM in Boston, a National Public Radio affiliate. Ken Armstrong is an investigative reporter with a share of four Pulitzer Prizes, including 2016’s award for explanatory journalism. He is currently a staff writer for the Marshall Project, a nonprofit news organization covering criminal justice.
07b922908c9d74296d88780bdac0737b
https://www.smithsonianmag.com/history/church-unearthed-ethiopia-rewrites-history-christianity-africa-180973740/
Church Unearthed in Ethiopia Rewrites the History of Christianity in Africa
Church Unearthed in Ethiopia Rewrites the History of Christianity in Africa In the dusty highlands of northern Ethiopia, a team of archaeologists recently uncovered the oldest known Christian church in sub-Saharan Africa, a find that sheds new light on one of the Old World’s most enigmatic kingdoms—and its surprisingly early conversion to Christianity. An international assemblage of scientists discovered the church 30 miles northeast of Aksum, the capital of the Aksumite kingdom, a trading empire that emerged in the first century A.D. and would go on to dominate much of eastern Africa and western Arabia. Through radiocarbon dating of artifacts uncovered at the church, the researchers concluded that the structure was built in the fourth century A.D., around the time Roman Emperor Constantine I legalized Christianity in A.D. 313 and was baptized on his deathbed in A.D. 337. The team detailed their findings in a paper published today in Antiquity. The discovery of the church and its contents confirms Ethiopian tradition that Christianity arrived at an early date in an area nearly 3,000 miles from Rome. The find suggests that the new religion spread quickly through long-distance trading networks that linked the Mediterranean via the Red Sea with Africa and South Asia, shedding fresh light on a significant era about which historians know little. “The empire of Aksum was one of the world’s most influential ancient civilizations, but it remains one of the least widely known,” says Michael Harrower of Johns Hopkins University, the archaeologist leading the team. Helina Woldekiros, an archaeologist at St. Louis’ Washington University who was part of the team, adds that Aksum served as a “nexus point” linking the Roman Empire and, later, the Byzantine Empire with distant lands to the south. That trade, by camel, donkey and boat, channeled silver, olive oil and wine from the Mediterranean to cities along the Indian Ocean, which in turn sent back iron, glass beads and fruits. The kingdom began its decline in the eighth and ninth centuries, eventually contracting to control only the Ethiopian highlands. Yet it remained defiantly Christian even as Islam spread across the region. At first, relations between the two religions were largely peaceful but grew more fraught over time. In the 16th century, the kingdom came under attack from Somali and then Ottoman armies, but ultimately retained control of its strategic highlands. Today, nearly half of all Ethiopians are members of the Ethiopian Orthodox Tewahedo Church. For early Christians, the risk of persecution from the Romans sometimes ran high, forcing them to practice their beliefs in private, posing a challenge for those scholars who study this era. Christianity had reached Egypt by the third century A.D., but it was not until Constantine’s legalization of Christian observance that the church expanded widely across Europe and the Near East. With news of the Aksumite excavation, researchers can now feel more confident in dating the arrival of Christianity to Ethiopia to the same time frame. “[This find] is to my knowledge the earliest physical evidence for a church in Ethiopia, [as well as all of sub-Saharan Africa,]” says Aaron Butts, a professor of Semitic and Egyptian languages at Catholic University in Washington, D.C., who was not involved with the excavation. Harrower’s team conducted their work between 2011 and 2016 at an ancient settlement called Beta Samati, which means “house of audience” in the local Tigrinya language.
The location, close to the modern-day border with Eritrea and 70 miles to the southwest of the Red Sea, appealed to the archaeologists in part because it was also home to temples built in a southern Arabian style dating back many centuries before the rise of Aksum, a clear sign of ancient ties to the Arabian Peninsula. The temples reflect the influence of Sabaeans, who dominated the lucrative incense trade and whose power reached across the Red Sea in that era. The excavators’ biggest discovery was a massive building 60 feet long and 40 feet wide resembling the ancient Roman style of a basilica. Developed by the Romans for administrative purposes, the basilica was adopted by Christians at the time of Constantine for their places of worship. Within and near the Aksumite ruins, the archaeologists also found a diverse array of goods, from a delicate gold and carnelian ring with the image of a bull’s head to nearly 50 cattle figurines—clearly evidence of pre-Christian beliefs. They also uncovered a stone pendant carved with a cross and incised with the ancient Ethiopic word “venerable,” as well as incense burners. Near the eastern basilica wall, the team came across an inscription asking “for Christ [to be] favorable to us.” In the research paper, Harrower said that this unusual collection of artifacts “suggests a mixing of pagan and early Christian traditions.” According to Ethiopian tradition, Christianity first came to the Aksum Empire in the fourth century A.D. when a Greek-speaking missionary named Frumentius converted King Ezana. Butts, however, doubts the historical reliability of this account, and scholars have disagreed over when and how the new religion reached distant Ethiopia. “This is what makes the discovery of this basilica so important,” he adds. “It is reliable evidence for a Christian presence slightly northeast of Aksum at a very early date.” While the story of Frumentius may be apocryphal, other finds at the site underline how the spread of Christianity was intertwined with the machinations of commerce. Stamp seals and tokens used for economic transactions uncovered by the archaeologists point to the cosmopolitan nature of the settlement. A glass bead from the eastern Mediterranean and large amounts of pottery from Aqaba, in today’s Jordan, attest to long-distance trading. Woldekiros added that the discoveries show that “long-distance trade routes played a significant role in the introduction of Christianity in Ethiopia.” She and other scholars want to understand how these routes developed and their impacts on regional societies. “The Aksumite kingdom was an important center of the trading network of the ancient world,” says Alemseged Beldados, an archaeologist at Addis Ababa University who was not part of the study. “These findings give us good insight ... into its architecture, trade, civic and legal administration.” “Politics and religion are important factors in shaping human histories, but are difficult to examine archaeologically,” says Harrower. The discoveries at Beta Samati provide a welcome glimpse into the rise of Africa’s first Christian kingdom—and, he hopes, will spark a new round of Aksum-related excavations. Andrew Lawler is author of The Secret Token: Myth, Obsession, and the Search for the Lost Colony of Roanoke. He is also a contributing writer for Science magazine and has written for The New York Times, The Washington Post, Smithsonian, National Geographic, and other publications. Website: andrewlawler.com
0649f98b7697ae410c5461576114f24a
https://www.smithsonianmag.com/history/civil-war-african-american-veterans-town-own-unionville-180964398/
After the Civil War, African-American Veterans Created a Home of Their Own: Unionville
After the Civil War, African-American Veterans Created a Home of Their Own: Unionville After the Civil War, 18 veterans of the United States Colored Troops returned to Talbot County, on Maryland’s Eastern Shore, where their families had toiled for generations. But this time, they had a chance to create something their ancestors had been denied: a village of their own, where everyone was free. It is believed to be the only village in the United States founded by formerly enslaved soldiers. And now, as it celebrates its 150th anniversary, it stands as a powerful symbol of resilience. The founders named it Unionville—a daring statement in that time and at that place. While Maryland had remained in the United States during the war, most of the landed gentry of Talbot County had been fiercely secessionist. Eighty-four sons of Talbot fought for the Confederacy; one of them, Franklin Buchanan, served as an admiral in the Confederate navy. The presence, after the war, of a free, black settlement, named for the hated Union, made a dramatic claim to equality and liberty. It was the persistence of questions about race and justice in America that drew the photojournalist Gabriella Demczuk to Unionville in the summer of 2015. After documenting the killings of several unarmed black men around the country, she noticed that much of “the coverage we were seeing only perpetuated the negative stereotypes of black communities. I wanted to work on a story that celebrated black life.” Demczuk, who grew up around Baltimore, visited Talbot County as a young woman and heard about a history of Unionville that her uncle, Bernard Demczuk, a George Washington University administrator and lecturer, was writing. But only after the 2015 killing of Freddie Gray in Baltimore, she says, did she “finally pick up his book and learn about the town’s history.” The establishment of Unionville defied more than 200 years of Talbot County history: For generations, slavery was “part and parcel of the land,” Bernard Demczuk writes in his history. From the time the county was founded, in the 1660s, it depended on enslaved labor, and its plantation economy made a handful of white families quite rich. The Eastern Shore’s terrain, laced with creeks and rivers leading to the Chesapeake Bay, made it easy to send out tobacco, grain and other crops—and to bring in enslaved workers. But, as Bernard Demczuk told me recently, “The waterways that enslaved you could also free you.” Frederick Douglass (who once worked at the Wye House, a short walk from where Unionville now stands) and fellow abolitionists Henry Highland Garnet (from nearby Kent County) and Harriet Tubman (from Dorchester, one county south) all escaped enslavement and its astounding cruelty. Douglass, in his 1845 autobiography, describes an overseer whipping a laborer named Demby, then shooting him dead after he sought relief from his wounds by jumping into a creek. Once the Union began enlisting African-American troops, in 1863, some 8,700 black Marylanders seized the chance. (Some slaveholders accepted the Union’s offer of $300 per man to let them go.) After the war ended in 1865, eighteen black soldiers returned to Talbot County—including Charles and Benjamin Demby, relatives of the man whose murder Frederick Douglass described.
In 1867, a Quaker couple, Ezekiel and Sarah Cowgill, who had always worked their Talbot plantations with paid labor, gave the veterans assistance that other landowners refused. The Cowgills began leasing half-acre lots to the 18, who would come to own them. The next year, the couple sold them a parcel for a schoolhouse, and then another for a church, which became St. Stephens AME. In time, 49 families called Unionville home. The village was an island of black self-determination in a sea of white resentment. Some of Talbot’s emancipated workers spent years in forced “apprenticeships,” prison work camps and other measures meant to perpetuate the old caste system. Maryland passed Jim Crow laws as early as 1870. Sporadic lynchings on the Eastern Shore began in the 1890s. In 1916, a monument to the 84 “Talbot Boys” who fought for the Confederacy went up outside the county courthouse in Easton, just a few miles from Unionville. Not until the civil rights movement of the 1970s, Bernard Demczuk says, did Unionville’s relationship with its surroundings begin to improve. The 18 founders now lie in the graveyard at St. Stephens, and the descendants of all but a handful of the 49 families have moved on. Unionville is majority, but not exclusively, black, and Talbot County is becoming popular as a tourist and retirement haven. Still, “there is a vision of Unionville,” said the Rev. Nancy M. Dennis, pastor of St. Stephens, “and that is sacred memories on hallowed ground.” Dennis was speaking on Memorial Day, when Unionville formally celebrated its sesquicentennial with a giant party featuring locals, people from neighboring towns, American Legion vets and marching bands. A dance company from Baltimore performed in Union blue regalia. A gray-haired white woman read a poem she wrote in the voice of an enslaved black man. Descendants of both the African-American founders and the white plantation owners for whom they’d toiled clapped, sang, marched, danced and feasted on crab cakes, chicken and waffles, shrimp, and crab rolls. As in New Orleans and Charleston, civil rights activists have pushed to remove Confederate monuments, including the Talbot Boys, from the county courthouse, arguing their presence casts a pall over the halls of justice. The county has declined. But in 2011, local officials added a statue of Frederick Douglass there. Bernard Demczuk said he thinks that’s about right, having the Talbot Boys and Douglass juxtaposed, “so we can have that conversation.” Bernadine Davis, 35, a member of St. Stephens and a descendant of Unionville founder Zachary Glasgow, said that conversation has yet to begin. “No one really talks about it,” she said. At the same time, the display of interracial fellowship at the sesquicentennial is now a way of life in Talbot County. “You do have your bickering and your arguing, but everyone is of one accord,” she says. “The majority of the black people of Unionville are family. The white people are family as well.” This article is a selection from the September issue of Smithsonian magazine Gabriella Demczuk is a photojournalist based between Baltimore and Washington, D.C. Natalie Hopkinson is a Washington, D.C.-based author and assistant professor at Howard University.
94ea8a407cb7f9d879f65f53053bd93d
https://www.smithsonianmag.com/history/civil-war-artifacts-in-the-smithsonian-539020/
The Civil War
The Civil War This well-preserved leather trunk, believed to have been kept for years in an attic before being sold at auction, once belonged to First Lt. George T. Garrison, son of the famous abolitionist William Lloyd Garrison and an officer with the 55th Massachusetts Infantry Regiment, an outfit of black volunteer soldiers. Lt. Garrison, a white officer leading a unit of black troops, fought in every battle “with an exemplary record” and was said to have led the troops through the streets of captured Charleston, South Carolina, in February 1865, singing the popular Union ballad known as the “John Brown Song.” Garrison enlisted in the Civil War over the protestations of his father, who argued that slavery should be ended through peaceful means, says collections specialist Michele Gates-Moresi. “It was kind of this weird tension [William Lloyd Garrison] had; no compromise but also no war. Coming from this family, it was probably a big decision for [George Garrison] to participate in the war.” According to Gates-Moresi, secondary sources reveal that the senior Garrison later accepted the necessity of war. “He supported his son at the end.” “We thought that story was interesting, just to let people know that it’s not this sort of black-and-white story where there’s pro-slavery people and there’s abolition people,” Gates-Moresi says. “There’s lots of people in between who have these issues, and that’s a way to get at those subtleties and nuances of the history.” This trunk, “very much a period piece,” used for traveling, helps unpack those hidden stories and complicated truths. The National Museum of African American History and Culture is slated to open in 2015. by Arcynta Ali Childs
53f7eb5ca3142da2183b61f34d26f41a
https://www.smithsonianmag.com/history/clarence-darrow-jury-tamperer-109085/
Clarence Darrow: Jury Tamperer?
Clarence Darrow: Jury Tamperer? On a rainy night in Los Angeles in December 1911, Clarence Darrow arrived at the apartment of his mistress, Mary Field. They sat at the kitchen table, beneath a bare overhead light, and she watched with dismay as he pulled a bottle of whiskey from one pocket of his overcoat and a handgun from the other. “I’m going to kill myself,” he told her. “They’re going to indict me for bribing the McNamara jury. I can’t stand the disgrace.” The great attorney had come to Los Angeles from Chicago to defend James and John McNamara, brothers and unionists accused of conspiring to bomb the Los Angeles Times, the city’s anti-union newspaper, killing 20 printers and newsmen. But jury selection had not gone well, and Darrow feared the brothers would hang. One morning a few weeks earlier, Darrow had taken an early streetcar to his office in the Higgins Building, the new ten-story Beaux-Arts structure at the corner of Second and Main Streets. At around 9 a.m. the telephone rang. Darrow spoke briefly to the caller. Then he picked up his hat and left the building, heading south on the sidewalk along Main. Meanwhile, his chief investigator, a former sheriff’s deputy named Bert Franklin, was two blocks away, passing $4,000 to a prospective member of the McNamara jury who had agreed to vote not guilty. Franklin, in turn, was under police surveillance: The juror had reported the offer to the authorities, who had set up a sting. Franklin now sensed that he was being watched and headed up Third Street to Main. There he was arrested—just as Darrow joined him. Franklin became a witness for the state, and in January 1912, Darrow was arrested and charged with two counts of bribery. With the help of another legendary trial lawyer, California’s Earl Rogers, Darrow was acquitted in one trial, and the other ended with a hung jury. He returned to Chicago broke and disgraced, but he picked up the pieces of his career and became an American folk hero—champion of personal liberty, defender of the underdog, foe of capital punishment and crusader for intellectual freedom. Darrow’s ordeal in Los Angeles 100 years ago was eclipsed by his later fame. But for a biographer the question is insistent: Did America’s greatest defense attorney commit a felony and join in a conspiracy to bribe the McNamara jurors? In writing a new account of Darrow’s life, with the help of fresh evidence, I concluded that he almost certainly did. The Los Angeles Law Library is on Broadway, across the street from the lot, now empty, where the bombing destroyed the Los Angeles Times building. The library holds the 10,000-page stenographic record of Darrow’s first bribery trial. It is a moving experience to page through the testimony so close to where the carnage took place. The McNamaras’ trial was cut short after six weeks when Darrow secured a plea agreement that would spare their lives. James McNamara pleaded guilty to murder in the Times bombing and was sentenced to life in prison; his brother pleaded guilty to a different bombing and was sentenced to 15 years. The agreement was still being finalized when Darrow’s investigator, Franklin, was arrested on the street for bribery. Darrow’s own trial was a legal hellzapoppin’. Rogers was skilled at baiting prosecutors and distracting juries with caustic asides and courtroom antics. (At one point he wrestled with the furious district attorney, who was preparing to throw a glass inkwell at the defense team.) Truth be told, the prosecution had a weak case.
Aside from Franklin’s testimony, and Darrow’s presence at the scene on Main Street that morning, there was little corroborating evidence tying the attorney to the crime of bribery. And, in an astounding exchange, Rogers got Franklin to concede that prosecutors had promised him immunity; he had had his fines paid; and he had met covertly with California’s notoriously venal robber barons, who promised to reward him if he testified against Darrow. With eloquent closing arguments, Rogers and Darrow persuaded the jury that Darrow was in fact the victim—a target of rapacious capital, out to subdue labor. Darrow’s early biographers—the novelist Irving Stone (Clarence Darrow for the Defense, 1941) and Chicago’s Arthur and Lila Weinberg (Clarence Darrow: A Sentimental Rebel, 1980)—concluded their hero was most likely innocent. Geoffrey Cowan, an attorney and scholar who examined the first bribery trial in minute detail in his 1993 book, The People v. Clarence Darrow, reached a different verdict. Cowan weighed the number of Darrow’s contemporaries—friends, acquaintances and journalists who covered the trial—who believed he was guilty of arranging the bribe. They forgave Darrow, for the most part, because they shared his conviction that the vast power and wealth arrayed against labor unions, and the often violent and illegal tactics of corporations, justified such an extreme measure to spare the defendants. “What do I care if he is guilty as hell; what if his friends and attorneys turn away ashamed of him?” the great muckraker Lincoln Steffens wrote of his friend in a letter. Neither Cowan nor I found evidence of a conspiracy to frame Darrow in the files of the U.S. Justice Department, or in the papers of Walter Drew, the steel industry’s union-busting lobbyist, who had led and helped fund the case against the McNamaras. To write my story of Darrow’s life, I tapped university and courtroom archives at more than 80 institutions. Perhaps the most intriguing new evidence I found was in Mary Field’s diary. In researching their biography, the Weinbergs persuaded Field’s daughter to share segments of her mother’s papers, which included selections from her diary and correspondence from Darrow. The material offers a unique glimpse into the man: To Mary Field he poured out his feelings in evocative letters. Long after their affair ended, they remained loving friends. Field’s diaries are now at the University of Oregon, where I spent a week going through them page by page. Aside from Darrow’s wife, Ruby, no one was closer to him during his ordeal in Los Angeles. Field, a bold young journalist, was Darrow’s lover, friend, legal assistant, press agent and investigator. She never wavered, in private or public, from insisting he was innocent. But in a 1934 diary notation I found this passage: Read life of Earl Rogers and revive memories of 23 years ago—memories more vivid than those of a year ago. Memories burned in with red hot rods. Days when I walked through Gethsemane with Darrow, crushed and weighted with the desertion of friends, with betrayal, with the impending doom of jail...bribing a juror to save a man’s life...who knows if he did? But he wouldn’t hesitate anyway. If men are so cruel as to break other men’s necks, so greedy as to be restrained only by money, then a sensitive man must bribe to save. It is not conclusive. But I believe that it adds Mary to the list of Darrow intimates who suspected their hero was guilty. I uncovered another incriminating detail in one of Darrow’s long-lost letters. 
Irving Stone purchased the lawyer’s papers from his widow, and they were eventually donated to the Library of Congress. But not all the material in Darrow’s files made it to Washington, D.C. Hundreds of his private letters, unearthed by a collector named Randall Tietjen (many in a box marked “Christmas ornaments” in the basement of Darrow’s granddaughter), were made available to scholars by the University of Minnesota Law School Library in 2010 and 2011. And there I found a 1927 letter from Darrow to his son, Paul, instructing him to pay $4,500 to Fred Golding, a juror in the first bribery trial. I was stunned. Darrow was a generous soul. And it certainly is possible that Golding had fallen on hard times and asked for help, and that Darrow responded out of the goodness of his heart. But $4,500 was serious money in 1927—more than $55,000 today—and it’s difficult to imagine that Darrow would be that generous in response to a hard-luck story. And it should be noted that Golding was Darrow’s most outspoken defender on the jury. Golding took the lead in quizzing prosecution witnesses from the jury box, which was permitted in California. He openly suggested that the case was a frame-up orchestrated by California’s business interests as part of their infamous scheme (immortalized in the film Chinatown) to steal water from the Owens Valley and ship it to Los Angeles. To be sure, Golding may have been a harmless conspiracy theorist, and Darrow may indeed have conceived of paying him only after the trial. But the question demands an answer: Did Darrow bribe a juror while on trial for bribing jurors? If so, what does that say about his willingness to join in the McNamara bribery plot? “Do not the rich and powerful bribe juries, intimidate and coerce judges as well as juries?” Darrow once asked an associate. “Do they shrink from any weapon?” Finally, there’s a telegram Darrow sent. It was the philanthropist Leo Cherne who acquired Darrow’s papers from Stone and donated them to the Library of Congress. But in a collection of Cherne’s papers in the Boston University archives, there are several files of Darrow letters, telegrams and other sensitive documents that did not travel with the rest to Washington. Much of the correspondence in the Cherne collection is from the winter of 1911-12. The most intriguing item is a telegram Darrow sent to his older brother Everett the day he was indicted. “Can’t make myself feel guilty,” Darrow wrote. “My conscience refuses to reproach me.” He doesn’t say he is innocent—only that his conscience is clear. That was an important distinction for Darrow, for whom motive was the overriding question in defining an evil, a sin or a crime. Darrow’s great patron was Illinois Gov. John Altgeld, who, Darrow said admiringly, was “absolutely honest in his ends and equally as unscrupulous in the means he used to attain them.” Altgeld “would do whatever would serve his purpose when he was right. He’d use all the tools of the other side—stop at nothing,” he said. “There never was a time that I did not love and follow him.” In both his trials Darrow pleaded not guilty, took the stand, swore an oath and testified that Franklin’s testimony against him was a lie. But in the telegram to his brother and other correspondence to family and friends, Darrow distinguishes between legal and moral guilt. “Do not be surprised at any thing you hear,” Darrow warned his son, in a note newly unearthed from the Minnesota files.
But, he told Paul, “my mind and conscience are at ease.” Indeed, in his second trial, Darrow virtually dared the jury to convict him, making arguments that seemed to justify the McNamaras’ terrorist attack. Jim McNamara placed the bomb in the Times building, Darrow told the jury, because “he had seen those men who were building these skyscrapers, going up five, seven, eight, ten stories in the air, catching red hot bolts, walking narrow beams, handling heavy loads, growing dizzy and dropping to the earth, and their comrades pick up a bundle of rags and flesh and bones and blood and take it home to a mother or a wife.” Darrow went on, “He had seen their flesh and blood ground into money for the rich. He had seen the little children working in factories and the mills; he had seen death in every form coming from the oppression of the strong and the powerful; and he struck out blindly in the dark to do what he thought would help....I shall always be thankful that I had the courage” to represent him. After hearing that, the jurors told reporters, they were convinced that Darrow would surely resort to bribery, and other illegal acts, to defend or advance his beliefs and clients. How should we judge Darrow? He left Los Angeles in 1913 a changed man. “The cynic is humbled,” his friend Steffens wrote. “The man that laughed sees and is frightened, not at prison bars, but at his own soul.” After he returned to Chicago, he rebuilt his practice and his reputation by taking cases that other lawyers would not touch. Mentally ill men accused of heinous crimes. Black men charged with raping white women. Communists and anarchists snared in the reactionary fervor of the Red Scare. He defended Frank Lloyd Wright when federal prosecutors hounded the architect for violating the Mann Act, which made it a crime to transport women across state lines for “immoral purposes.” He saved the killers Nathan Leopold and Richard Loeb from the gallows. Most famously, he scored a triumph for academic freedom after John Scopes was accused of violating a Tennessee law that prohibited the teaching of evolution. “The marks of battle are all over his face,” the journalist H.L. Mencken wrote. “He has been through more wars than a whole regiment of Pershings....Has he always won? Actually, no. His cause seems lost among us. “Imbecilities, you say, live on? They do,” wrote Mencken. “But they are not as safe as they used to be.”
“No man lived a better life or died a better death than fighting for his home and his children.” At the end of his speech, James Weldon Johnson, the NAACP’s leader, embraced the aged lawyer and wept with him there in the courtroom. A few weeks later, Darrow was staggered by a heart attack. He was never the same. He had been, said Steffens, “the attorney for the damned.” Ultimately, I forgave him. John A. Farrell has written Clarence Darrow: Attorney for the Damned.
3a220df6bd0ffeaeec9d88da58a3d40e
https://www.smithsonianmag.com/history/climate-wars-conflicts-collapses-spurred-climate-change-180952862/
Five Conflicts and Collapses That May Have Been Spurred by Climate Change
Five Conflicts and Collapses That May Have Been Spurred by Climate Change Is climate change a matter of national security? In a warming world, sea-level rise, drought and soil degradation are putting basic human needs such as food and shelter at risk. In March, the U.S. Department of Defense called climate change a “threat multiplier,” saying that competition for resources “will aggravate stressors abroad such as poverty, environmental degradation, political instability and social tensions—conditions that can enable terrorist activity and other forms of violence.” Connecting climate change to a global increase in violence is tricky, and attempts to make such a link receive a fair amount of criticism. A hotter planet doesn’t automatically become a more conflict-ridden one. The 2000s, for instance, saw some of the highest global temperatures in recorded history—and some of the lowest rates of civil conflict since the 1970s. But there are historical examples of civilizations that did not fare well when faced with drastic environmental change, and those examples may offer a window into the future—and even help prevent catastrophe. “We can never know with 100-percent certainty that the climate was the decisive factor [in a conflict],” says Solomon Hsiang, assistant professor of public policy at the University of California, Berkeley. “But there’s a lot of cases where things look pretty conspicuous.” Around 2350 B.C., the Akkadian empire conquered and united the various city-states of Sumer in Mesopotamia. For almost two centuries, this powerful empire stretched from the Mediterranean Sea to what is now inner Iran, setting up vast stretches of agricultural land and trade routes. Then, around 2100 B.C., the empire collapsed, and the land remained unsettled for nearly 300 years. Archaeologists attributed the empire’s abrupt end to invasions and political strife. But in one region, formerly the center of the empire’s grain production, the soil also held an intriguing clue: a thin layer of volcanic ash covered by a thicker layer of wind-blown silts. That region, it seemed, suffered from a sudden shift to more arid conditions. In 2000, an international group of scientists studied marine sediment cores taken from the Gulf of Oman, more than 1,000 miles from what would have been the heart of the Akkadian empire. From these cores, the scientists were able to create a holistic picture of climate in the region. They found distinct peaks of the minerals calcite and dolomite beginning around 2025 B.C. that lasted approximately 300 years. These minerals are transported to the ocean as dust from dry, arid regions, so their abundance suggests that the collapse of the Akkadian empire was caused, at least in part, by a rapid and unprecedented drying, which in turn led to mass migrations, overcrowded cities and, eventually, internal violence within the empire. Natasha Geiling is an online reporter for Smithsonian magazine.
6aa42fba672fdeb68275c528a08ff3fd
https://www.smithsonianmag.com/history/comets-tale-63573615/
Comet’s Tale
Comet’s Tale The throng of spectators, including famed airplane designer Sir Geoffrey de Havilland, heard the earsplitting shriek before they saw the sleek, bullet-shaped aircraft burst out of the mist and hurtle down the runway at London Airport. The Comet 1 airliner roared into the air—and into history—on 20,000 pounds of thrust from its four De Havilland Ghost jet engines. For the first time ever, a jet-propelled aircraft was carrying passengers over a scheduled commercial route. It was Saturday, May 2, 1952. On board were 36 passengers, six crew members and 30 bags of mail. At the Comet’s controls, British Overseas Airways Capt. Michael Majendie headed the jet toward Rome, the first of five stops on the 6,724-mile journey to Johannesburg, South Africa. The plane smoothly accelerated to a cruising altitude of 35,000 feet and a speed of 460 miles per hour, more than 100 miles per hour faster than the fastest propeller-driven airliner. Suddenly, the world was a smaller place. Less than 24 hours later, thousands more onlookers ringed Johannesburg’s Palmietfontein Airport as the Comet 1—registration G-ALYP, dubbed “Yoke Peter,” from the phonetic alphabet then in use in Britain (George-Able-Love-Yoke-Peter)—streaked into view. Capt. R. C. Alabaster, now 84, who flew the last three legs of the flight from Khartoum, remembers the scene vividly. “Oddly enough, as we circled the airport we could see all these cars and people blocking the roads, and we thought it just must be busy. It wasn’t until after we landed that we learned they had come to see us.” Comet flight engineer Alan Johnson, now 83, who had flown many test flights, says, “This trip was the hardest because we had to make sure we got into Jo’burg on time and out the next day. By then I was quite used to crowds wherever we flew.” Though Aubrey Cookman, an editor at Popular Mechanics magazine, found the plane noisier than he had expected, he told reporters that his only regret was that the United States wouldn’t have anything like the Comet for several years. He was right: the British were far ahead of the United States in the development of passenger jets. The revolutionary planes could be traced to World War II, when a group of visionaries, led by Lord Brabazon of Tara (often called the father of British aviation), convened to study Great Britain’s postwar position in commercial aviation. The committee was haunted by the knowledge that by 1939, the American twin-engine Douglas DC-3 was carrying a staggering 90 percent of the world’s airline passengers. America ruled the skies and looked poised to continue to do so. In the war years, the much bigger and faster Douglas DC-4 and the Lockheed Constellation 649 took to the air, ready to jump into commercial service as soon as the war ended. Brabazon’s group knew that the noise and vibration of propeller-driven planes were significant fatigue factors for passengers on long-distance flights, as four behemoth 18-cylinder engines responded to thousands of gasoline-fueled explosions per minute. Such engines required complex supercharging—forced compression of air into the cylinders—to cruise efficiently at high altitudes, above bumpy and hazardous weather. Though the big piston engines were crafted with skill and precision, they simply could not be made to run smoothly, nor could they be easily made more powerful than they already were.
The committee was also aware that jet engines, invented independently before the war by both English and German experimenters, were virtually vibration-free. Furthermore, jets were at home at high speeds and high altitude. If the British could parlay their lead in jet-engine technology into a new airliner, they might be able to break America’s choke hold on commercial airline sales. By war’s end, only one British manufacturer—De Havilland—had built a jet engine and designed a plane for it. With the blessing of Britain’s Ministry of Supply and working under a cloak of secrecy, Sir Geoffrey accepted the challenge of creating a commercial jet airliner. A major problem for the designers was fuel consumption, which was at least three times greater for jets than for piston engines, especially at low altitudes. Kerosene was the fuel, and 1945-vintage turbojet engines consumed it three to four times as fast at 10,000 feet as at 30,000. Sir Geoffrey reasoned that a plane could fly more efficiently at 35,000 feet, where the air was thinner and less power would be required for propulsion. Such high-flying planes, though, would need a pressurized cabin to allow passengers to breathe without oxygen masks. Pressurization would mean that as the airliner climbed to its cruising altitude nearly seven miles above the earth, the cabin would have to be pumped with air until its interior pressure exceeded the pressure outside the fuselage by about five pounds per square inch. As the plane descended to land, cabin pressure would have to be bled off again. Each cycle would put enormous stress on the plane’s structure; the tubular cabin would stretch slightly when pressurized, then contract as pressure was released. Just three years after full-fledged design work commenced, De Havilland chief test pilot John Cunningham lifted the Comet off the ground for the first time and pronounced the plane “Very promising. Very quick.” Joining him as test pilots were Michael Majendie and Ernest Rodley, now 87, who became the world’s first certified commercial jet pilot. “I was able to get down to the Ministry of Aviation in London to get my license endorsed first,” says Rodley. “That’s the only reason I’ve achieved fame.” Of Majendie, an expert in flight planning, he says, “He was the brains, and I was the experience. Together we made quite a little team.” The British Overseas Airways Corporation ordered eight of the airliners, and as word spread, other airlines came knocking on De Havilland’s door. Only one U.S. carrier, Pan Am, placed an order, for three larger, longer-range Comet 3s, which were still on the drawing board. For the most part, the American airline industry—then highly profitable with its existing propeller-driven fleets—had little interest in spending huge amounts of money for untried, fuel-guzzling jets. In only its first year, the Comet flew 104.6 million miles, carrying 28,000 passengers. Then, on October 26, 1952, a Comet leaving Rome ran off the runway and skidded to a halt with a broken landing gear. The 35 passengers and eight crew members survived. Five months later, a Canadian Pacific Comet bound from London to Sydney crashed on takeoff at Karachi, Pakistan, and burned, killing all 11 passengers and the crew. An investigation revealed a flaw in wing configuration. Revised pilot instructions and a change in the wings’ leading edges solved the problem.
Then, two months later, a year to the day after the inaugural flight, a BOAC Comet with 43 passengers and crew disintegrated at 10,000 feet after leaving Calcutta in a heavy thunderstorm. Eight months after that, on January 10, 1954, something went terribly wrong at 26,000 feet on a BOAC flight a few minutes out of Rome. “I heard a roar, very high,” police quoted one eyewitness as telling them. “Then there was a series of blasts. The next thing I saw was a streak of smoke plunging perpendicularly into the sea.” The plane, the inaugural Yoke Peter, carried 29 passengers and a crew of six. The next day, BOAC grounded all Comet flights. “Initially, we didn’t think it could be mechanical breakup,” says Captain Alabaster. “We had every confidence in the airplane.” Adds Ernest Rodley: “It was a perfect airplane as far as we were concerned. We were absolutely puzzled by the problems.” The Ministry of Civil Aviation launched the largest aircraft accident investigation undertaken up to that time, and the British Admiralty started a salvage operation—no easy task, given that the plane had gone down in 500 feet of water. Within a month, the navy had brought up a big section of Yoke Peter’s tail, along with skin from the fuselage and miscellaneous other parts. The wreckage was taken to the Royal Aircraft Establishment at Farnborough, England, for scrutiny by scientists and engineers. After investigators concluded that “there appeared to be no justification for placing special restrictions on the Comet aircraft,” the planes began flying again. Public confidence remained high; every seat on the first resumed flight was filled. But on April 8, even as Yoke Peter’s remains were still being assembled at Farnborough, a South African Airways Comet on a flight from Rome to Cairo lost radio contact at 35,500 feet and fell into the Mediterranean. Fourteen passengers and seven crew members were lost. Comets were immediately grounded for the second time in three months. Prime Minister Winston Churchill now intervened. “The cost of solving the Comet mystery must be reckoned in neither money nor manpower,” he declared. At stake were no less than the credibility of the British aircraft industry and the viability of jet aircraft worldwide. Yoke Peter’s reassembled pieces pointed to metal fatigue. But why? Pressurization was the leading suspect. Says Captain Rodley, who took part in the inquiry: “No one had taken into consideration the pressurizing cycles on the fuselage for a given time span, which were faster than the equivalent cycles in the slower, propeller-driven airplanes.” To gauge the effect of these cycles, an entire Comet fuselage was placed in a giant water tank, and its sealed interior filled with water. To simulate cabin-pressure changes in an aircraft climbing to 35,000 feet and then descending again, interior pressure was increased and decreased at three-minute intervals. Around-the-clock testing aged the Comet nearly 40 times faster than actual service (a rough check of these figures appears at the end of this story). In the meantime, autopsy reports from the Italian pathologist who examined the bodies of victims of one of the crashes indicated they had died “by violent movement and explosive decompression.” Evidence pointed to the catastrophic failure of the fuselage. The final clue, revealing the weakness in the Comet’s structure, turned up on June 24 in the tank at Farnborough, where the immersed test Comet had been subjected to the equivalent of 9,000 flying hours.
Instruments showed a sudden drop in cabin pressure, indicating that something had happened in the tank. When the drains were opened and the water flooded out, scientists stared in grim amazement. Repeated pressurization had caused the fuselage to split. One fracture started in the corner of a window atop the aircraft where radio aerials were housed and continued for eight feet, passing directly through a window frame in its path. Closer examination showed discoloration and crystallization, telltale evidence of metal fatigue. At high altitude, after many pressurization cycles, the Comets’ fuselages simply lost their ability to contain high air pressure, and the planes exploded with bomblike force. After the investigation, the Comet 1’s future was sealed. It never carried another passenger. Neither did its would-be successors, Comets 2 and 3. Comet 4 was four years in production, and by the time it went into service it had been overtaken by developments in the United States. Fewer than 70 were ever built for airline service. On July 15, 1954, test pilot Tex Johnston lifted the cream-and-buff Boeing 367-80 (the famous “Dash-80,” now in the collection of the Smithsonian’s National Air and Space Museum) off the runway at Renton, Washington. It was the first flight of what would become a new jet airliner, the Boeing 707, with more than three times the passenger capacity of the Comet 1. It would enter service in 1958, at the same time as the much smaller Comet 4. In all, eight hundred and fifty-five 707s would roll off Boeing’s assembly lines. The United States had entered the jet age, where it would maintain its dominance into the 21st century. Still, Boeing had not gotten there first. That honor went to De Havilland and the Comet, which had made a shrinking world even smaller, changing forever the way its people traveled the globe.
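A rough cross-check of the figures in this account, for readers who want to see how they fit together: the five-pound-per-square-inch cabin differential and the three-minute tank cycles come from the story above, while the roughly two-hour length of a typical pressurization cycle in airline service is an assumption inferred from the stated 40-fold acceleration, not a number from the Farnborough records. In LaTeX notation:

\[
5\ \tfrac{\text{lb}}{\text{in}^2} \times 144\ \tfrac{\text{in}^2}{\text{ft}^2} = 720\ \tfrac{\text{lb}}{\text{ft}^2}
\quad \text{(the load borne by every square foot of pressurized cabin skin)}
\]

\[
\frac{120\ \text{min (assumed service cycle)}}{3\ \text{min (tank cycle)}} = 40,
\qquad
\frac{9{,}000\ \text{flying hours}}{40} \approx 225\ \text{hours} \approx 9.4\ \text{days}
\]

On those assumptions, the water tank compressed the equivalent of years of airline flying into little more than a week of around-the-clock cycling, which is why a weakness that took the fleet many months to reveal could surface at Farnborough in a matter of days.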
2a5246460b0927ee4e578bf5eddbe40d
https://www.smithsonianmag.com/history/commemorating-100-years-of-the-rv-56915006/?no-ist
Commemorating 100 Years of the RV
Commemorating 100 Years of the RV Every December 15, Kevin Ewert and Angie Kaphan celebrate a “nomadiversary,” the anniversary of wedding their lives to their wanderlust. They sit down at home, wherever they are, and decide whether to spend another year motoring in their 40-foot recreational vehicle. Their romance with the road began six years ago, when they bought an RV to go to Burning Man, the annual temporary community of alternative culture in the Nevada desert. They soon started taking weekend trips and, after trading up to a bigger RV, motored from San Jose to Denver and then up to Mount Rushmore, Deadwood, Sturgis, Devil’s Tower and through Yellowstone. They loved the adventure, and Ewert, who builds web applications, was able to maintain regular work hours, just as he’d done at home in San Jose. So they sold everything, including their home in San Jose, where they’d met, bought an even bigger RV, and hit the road full time, modern-day nomads in a high-tech covered wagon. “What we’re doing with the RV is blazing our own trail and getting out there and seeing all these places,” Ewert says. “I think it’s a very iconic American thing.” The recreational vehicle turns 100 years old this year. According to the Recreational Vehicle Industry Association, about 8.2 million households now own RVs. They travel an average of 26 days and 4,500 miles annually, according to a 2005 University of Michigan study. That study estimates about 450,000 of them are full-time RVers like Ewert and Kaphan. Drivers began making camping alterations to cars almost as soon as they were introduced. The first RV was Pierce-Arrow’s Touring Landau, which debuted at Madison Square Garden in 1910. The Landau had a back seat that folded into a bed, a chamber pot toilet and a sink that folded down from the back of the seat of the chauffeur, who was connected to his passengers via telephone. Camping trailers made by Los Angeles Trailer Works and Auto-Kamp Trailers also rolled off the assembly line beginning in 1910. Soon, dozens of manufacturers were producing what were then called auto campers, according to Al Hesselbart, the historian at the RV Museum and Hall of Fame in Elkhart, Indiana, a city that produces 60 percent of the RVs manufactured in the United States today. As automobiles became more reliable, people traveled more and more. The rise in popularity of the national parks attracted travelers who demanded more campsites. David Woodworth—a former Baptist preacher who once owned 50 RVs built between 1914 and 1937, but sold many of them to the RV Museum—says in 1922 you could visit a campground in Denver that had 800 campsites, a nine-hole golf course, a hair salon and a movie theater. The Tin Can Tourists, named because they heated tin cans of food on gasoline stoves by the roadside, formed the first camping club in the United States, holding their inaugural rally in Florida in 1919 and growing to 150,000 members by the mid-1930s. They had an initiation; an official song, “The More We Get Together;” and a secret handshake. Another group of famous men, the self-styled Vagabonds—Thomas Edison, Henry Ford, Harvey Firestone and naturalist John Burroughs—caravaned in cars for annual camping trips from 1913 to 1924, drawing national attention. Their trips were widely covered by the media and evoked a desire in others to go car camping (regular folks certainly didn’t have their means). They brought with them a custom Lincoln truck outfitted as a camp kitchen.
While they slept in tents, their widely chronicled adventures helped promote car camping and the RV lifestyle. Later, CBS News correspondent Charles Kuralt captured the romance of life on the road with reports that started in 1967, wearing out motor homes by covering more than a million miles over the next 25 years in his “On the Road” series. “There’s just something about taking your home with you, stopping wherever you want to and being in the comfort of your own home, being able to cook your own meals, that has really appealed to people,” Woodworth says. The crash of 1929 and the Depression dampened the popularity of RVs, although some people used travel trailers, which could be purchased for $500 to $1,000, as inexpensive homes. Rationing during World War II stopped production of RVs for consumer use, although some companies converted to wartime manufacturing, making units that served as mobile hospitals, prisoner transports and morgues. After the war, the returning GIs and their young families craved inexpensive ways to vacation. The burgeoning interstate highway system offered a way to go far fast and that combination spurred a second RV boom that lasted through the 1960s. Motorized RVs started to become popular in the late 1950s, but they were expensive luxury items that were far less popular than trailers. That changed in 1967 when Winnebago began mass-producing what it advertised as “America’s first family of motor homes,” five models from 16 to 27 feet long, which sold for as little as $5,000. By then, refrigeration was a staple of RVs, according to Hesselbart, who wrote The Dumb Things Sold Just Like That, a history of the RV industry. “The evolution of the RV has pretty much followed technology,” Woodworth says. “RVs have always been as comfortable as they can be for the time period.” As RVs became more sophisticated, Hesselbart says, they attracted a new breed of enthusiasts interested less in camping and more in destinations, like Disney World and Branson, Missouri. Today, it seems that only your budget limits the comforts of an RV. Modern motor homes have convection ovens, microwaves, garbage disposals, washers and dryers, king-size beds, heated baths and showers and, of course, satellite dishes. “RVs have changed, but the reason people RV has been constant the whole time,” Woodworth says. “You can stop right where you are and be at home.” Ewert chose an RV that features an office. It’s a simple life, he says. Everything they own travels with them. They consume less and use fewer resources than they did living in a house, even though the gas guzzlers get only eight miles a gallon. They have a strict flip-flops and shorts dress code. They’ve fallen in love with places like Moab and discovered the joys of southern California after being northern California snobs for so long. And they don’t miss having a house somewhere to anchor them. They may not be able to afford a house in Malibu down the street from Cher’s place, but they can afford to camp there with a million-dollar view out their windows. They’ve developed a network of friends on the road and created NuRvers.com, a Web site for younger RV full-timers (Ewert is 47; Kaphan is 38). Asked about their discussion on the next December 15, Ewert says he expects they’ll make the same choice they have made the past three years—to stay on the road. “We’re both just really happy with what we’re doing,” he says. 
“We’re evangelical about this lifestyle because it offers so many new and exciting things.” Jim Morrison is a freelance writer whose stories, reported from two dozen countries, have appeared in numerous publications including Smithsonian.com, the New York Times, and National Wildlife.
42525ba379c32adcc895a69f751351a9
https://www.smithsonianmag.com/history/conspiracy-theories-abounded-19th-century-american-politics-180971940/
Conspiracy Theories Abounded in 19th-Century American Politics
Conspiracy Theories Abounded in 19th-Century American Politics From claims that NASA faked the moon landing to suspicions about the U.S. government’s complicity in the assassination of John F. Kennedy, Americans love conspiracy theories. Conspiratorial rhetoric in presidential campaigns and its distracting impact on the body politic have been a fixture in American elections from the beginning, but conspiracy theories flourished in the 1820s and 1830s, when modern-day American political parties developed, and the expansion of white male suffrage increased the nation’s voting base. These new parties, which included the Democrats, the National Republicans, the Anti-Masons, and the Whigs, frequently used conspiracy accusations as a political tool to capture new voters—ultimately bringing about a recession and a collapse of public trust in the democratic process. During the early decades of the American republic, the Federalist and Jeffersonian Republican Parties engaged in conspiratorial rhetoric on a regular basis. Following the War of 1812, the Federalist Party faded from the political landscape, leaving the Republicans as the predominant national party. Their hold was so great that in 1816 and 1820, James Monroe, the Republican presidential candidate, ran virtually unopposed, but in 1824, the Republicans splintered into multiple and disparate factions. Five viable candidates ran in that election cycle, and John Quincy Adams won the presidency. The controversy around Adams’s victory quickly fueled suspicions: Tennessean Andrew Jackson had won the most electoral and popular votes and the most regions and states, but because he did not win the majority of electoral votes, the U.S. House of Representatives was constitutionally required to choose the president in a runoff of the top three vote-getters. Jackson’s supporters believed that House Speaker Henry Clay, who had placed fourth in the regular election, helped Adams win the House election in return for being appointed secretary of state. The Jacksonians’ charges of a “corrupt bargain” between Adams and Clay ensured that the 1828 election would, in part, be fought over this conspiracy theory. During the hotly contested 1828 campaign, Jackson’s opponents, too, trafficked in conspiracy theories: In particular, administration men accused Jackson’s supporters of plotting a coup d’état if their candidate lost to President Adams. This “theory” held that pro-Jackson congressmen, upset about the national government’s attempts to impose a new tariff on imports, held “secret meetings” to discuss “the dissolution of the Union.” One pro-Jackson supporter “declared that he should not be astonished to see Gen. Jackson, if not elected, placed in the Presidential Chair, at the point of fifty thousand bayonets!!!” The thought of a national military hero such as Jackson leading a military rebellion had no basis in reality, but the conspiracy theory fit the tenor of the times. Jackson won—and conspiratorial rhetoric remained ever-present throughout his presidency. In the run-up to the 1832 election, the national organization of Freemasonry drew conspiracy theorists’ attention.
Spurred on by the murder of a New York Mason named William Morgan, who had threatened to disclose the fraternal order’s secrets, an Anti-Masonic political party had emerged during the 1828 election. Frequently repeated accusations that Freemasonry was secretive and elitist reflected larger concerns about the ways in which the ruling elite undermined the nation’s democratic institutions through corruption. And for the Anti-Masons, Jackson was no better than Adams; in their view, the Tennessean’s promise of “rotation in office” was simply cronyism. Four years later, the Anti-Masons had gained enough supporters to run William Wirt for president against the Democratic incumbent Jackson and the National Republican candidate Henry Clay. During the 1832 campaign, they accused Freemasons of a number of transgressions beyond Morgan’s murder, including subversion of free speech and democracy. Rhode Island Anti-Masons, for example, warned that Freemasons were “darkening the public mind” by attempting to quash public criticism of their organization in the state’s newspapers. Vermont’s William Strong charged the Democrats with following the Masonic dogma of “the end justifies the means” to elect Jackson in 1828 and secure government patronage for party members. But in that same election of 1832, Anti-Masons themselves became the target of conspiracy theorists. New York Democrats saw a plot afoot in the coalition of the Anti-Masonic Party and the National Republicans in their state. How was it possible, one New York newspaper asked, that the Anti-Masons had nominated Wirt, yet had allied themselves with Clay? It was not because of principled opposition to Freemasonry, as all three presidential candidates were Masons. The only answer was that it was a “deep laid conspiracy to defeat the wishes of the people” to elect Andrew Jackson. During Jackson’s second term, much of the conspiratorial rhetoric centered on the Bank War, the political battle between the president and the Second Bank of the United States, the nation’s chief financial institution, which held both government and private funds and was supposed to remain non-partisan in its loans. Jackson, however, believed that the bank’s president, Nicholas Biddle, had used the institution’s deposits and influence to assist John Quincy Adams in the 1828 election. If true, this was a blatant misuse of the people’s money. Consequently, Jackson exerted his power as chief executive to remove government funds from the Second Bank, which would cripple its financial power. In retaliation, Biddle began calling in the bank’s loans across the country, precipitating a financial recession to pressure the president to restore the government’s deposits. As a result, accusations of conspiracy flew on both sides. The anti-Jackson Whig Party (which had replaced the National Republican Party of the 1832 campaign) accused Vice President Martin Van Buren of being “at the bottom of all this hostility to the Bank.” Allegedly, the “Little Magician” was using his “arts and tricks” against the Second Bank to further his presidential prospects in 1836. Democrats then responded by constructing their own conspiracy theory about “the Boston Aristocracy” and its control of the Second Bank. They claimed this “nefarious conspiracy,” stretching back to the early days of the republic, had used the Second Bank to target the anti-aristocratic Southern and mid-Atlantic states, “producing universal panic and distress” by constricting the money supply in those regions. 
These same conspirators, according to Democrats, were now employing “the whole power of the present Bank to embarrass the administration and distress the country,” not to mention hurting the Democratic Party’s chances of retaining the White House. In the 1836 presidential campaign, which pitted Van Buren against three Whig candidates—William Henry Harrison, Daniel Webster, and Hugh Lawson White—the Whigs used conspiracy theories in an attempt to derail the Democrats’ chances for a political victory. They accused Van Buren of being a member of the Catholic Church and of participating in a “popish plot” intended “to conciliate the Catholics, in the U States for Political purposes.” Van Buren, who was raised in the Dutch Reformed Church, denied the accusation. Whigs also accused Democratic vice-presidential candidate Richard M. Johnson of wanting to force Washington society to accept his two daughters, who were the product of his relationship with an enslaved African-American woman. According to one Richmond Whig, Johnson’s “depraved tastes” threatened to destroy the racial barrier that kept African-Americans in a subordinate position, and endangered “the purity of our maidens, the chaste dignity of our matrons.” Van Buren and Johnson won in 1836, but Johnson’s family circumstances continued to plague his political career and harmed Van Buren’s standing with some Southern voters in 1840. It is difficult to pinpoint exactly how many votes changed because of conspiratorial rhetoric, either then or now. It seems clear, though, that American politicians believe that this type of rhetoric makes a difference—and that American voters have always had to be politically literate to determine the difference between conspiracy theories and actual conspiracies. This enduring belief in vast, unexplainable conspiracies has often contributed to voters’ feelings of powerlessness, increasing their cynicism and apathy. And of course, conspiratorial rhetoric undermines the nation’s democratic institutions and practices. Politically motivated conspiracy theories, ultimately, bring the same result as conspiracies themselves: a small number of elite Americans wielding immense power over the future of the United States, power that may not account for the will of the majority. Mark R. Cheathem is professor of history and project director of the Papers of Martin Van Buren at Cumberland University. He is the author of The Coming of Democracy: Presidential Campaigning in the Age of Jackson. This essay is part of What It Means to Be American, a project of the Smithsonian’s National Museum of American History and Arizona State University, produced by Zócalo Public Square.
6ac3fc26ea0a80efdf4520f21e2bf561
https://www.smithsonianmag.com/history/costs-confederacy-special-report-180970731/
The Costs of the Confederacy
With centuries-old trees, manicured lawns, a tidy cemetery and a babbling brook, the Jefferson Davis Home and Presidential Library is a marvelously peaceful, green oasis amid the garish casinos, T-shirt shops and other tourist traps on Highway 90 in Biloxi, Mississippi. One gray October morning, about 650 local schoolchildren on a field trip to Beauvoir, as the home is called, poured out of buses in the parking lot. A few ran to the yard in front of the main building to explore the sprawling live oak whose lower limbs reach across the lawn like massive arms. In the gift shop they perused Confederate memorabilia—mugs, shirts, caps and sundry items, many emblazoned with the battle flag of the Army of Northern Virginia. It was a big annual event called Fall Muster, so the field behind the library was teeming with re-enactors cast as Confederate soldiers, sutlers and camp followers. A group of fourth graders from D’Iberville, a quarter of them black, crowded around a table heaped with 19th-century military gear. Binoculars. Satchels. Bayonets. Rifles. A portly white man, sweating profusely in his Confederate uniform, loaded a musket and fired, to oohs and aahs. A woman in a white floor-length dress decorated with purple flowers gathered a group of older tourists on the porch of the “library cottage,” where Davis, by then a living symbol of defiance, retreated in 1877 to write his memoir, The Rise and Fall of the Confederate Government. After a discussion of the window treatments and oil paintings, the other visitors left, and we asked the guide what she could tell us about slavery. Sometimes children ask about it, she said. “I want to tell them the honest truth, that slavery was good and bad.” While there were some “hateful slave owners,” she said, “it was good for the people that didn’t know how to take care of themselves, and they needed a job, and you had good slave owners like Jefferson Davis, who took care of his slaves and treated them like family. He loved them.” The subject resurfaced the next day, before a mock battle, when Jefferson Davis—a re-enactor named J.W. Binion—addressed the crowd. “We were all Americans and we fought a war that could have been prevented,” Binion declared. “And it wasn’t fought over slavery, by the way!” Then cannons boomed, muskets cracked, men fell. The Confederates beat back the Federals. An honor guard in gray fired a deafening volley. It may have been a scripted victory for the Rebels, but it was a genuine triumph for the racist ideology known as the Lost Cause—a triumph made possible by taxpayer money. We went to Beauvoir, the nation’s grandest Confederate shrine, and to similar sites across the Old South, in the midst of the great debate raging in America over public monuments to the Confederate past. That controversy has erupted angrily, sometimes violently, in Virginia, North Carolina, Louisiana and Texas. The acrimony is unlikely to end soon. While authorities in a number of cities—Baltimore, Memphis, New Orleans, among others—have responded by removing Confederate monuments, roughly 700 remain across the South. To address this explosive issue in a new way, we spent months investigating the history and financing of Confederate monuments and sites. Our findings directly contradict the most common justifications for continuing to preserve and sustain these memorials. 
First, far from simply being markers of historic events and people, as proponents argue, these memorials were created and funded by Jim Crow governments to pay homage to a slave-owning society and to serve as blunt assertions of dominance over African-Americans. Second, contrary to the claim that today’s objections to the monuments are merely the product of contemporary political correctness, the monuments were actively opposed at the time, often by African-Americans, as instruments of white power. Finally, Confederate monuments aren’t just heirlooms, the artifacts of a bygone era. Instead, American taxpayers are still heavily investing in these tributes today. We have found that, over the past ten years, taxpayers have directed at least $40 million to Confederate monuments—statues, homes, parks, museums, libraries and cemeteries—and to Confederate heritage organizations. For our investigation, the most extensive effort to capture the scope of public spending on Confederate memorials and organizations, we submitted 175 open records requests to the states of the former Confederacy, plus Missouri and Kentucky, and to federal, county and municipal authorities. We also combed through scores of nonprofit tax filings and public reports. Though we undoubtedly missed some expenditures, we have identified significant public funding for Confederate sites and groups in Mississippi, Virginia, Alabama, Georgia, Florida, Kentucky, South Carolina and Tennessee. In addition, we visited dozens of sites to document how they represent history and, in particular, slavery: After all, the Confederacy’s founding documents make clear that the Confederacy was established to defend and perpetuate that crime against humanity. (Listen to an episode of Reveal, from The Center for Investigative Reporting, about this special reporting project.) A century and a half after the Civil War, American taxpayers are still helping to sustain the defeated Rebels’ racist doctrine, the Lost Cause. First advanced in 1866 by a Confederate partisan named Edward Pollard, it maintains that the Confederacy was based on a noble ideal, the Civil War was not about slavery, and slavery was benign. “The state is giving the stamp of approval to these Lost Cause ideas, and the money is a symbol of that approval,” Karen Cox, a historian of the American South at the University of North Carolina at Charlotte, said of our findings. “What does that say to black citizens of the state, or other citizens, or to younger generations?” The public funding of Confederate iconography is also troubling because of its deployment by white nationalists, who have rallied to support monuments in New Orleans, Richmond and Memphis. The deadly protest in Charlottesville, Virginia, in 2017, where a neo-Nazi rammed his car into counter-protesters, killing Heather Heyer, was staged to oppose the removal of a Robert E. Lee statue. In 2015, before Dylann Roof opened fire on a Bible study group at Emanuel African Methodist Episcopal Church in Charleston, South Carolina, killing nine African-Americans, he spent a day touring places associated with the subjugation of black people, including former plantations and a Confederate museum. “Confederate sites play to the white supremacist imagination,” said Heidi Beirich, who leads the Southern Poverty Law Center’s work tracking hate groups. “They are treated as sacred by white supremacists and represent what this country should be and what it would have been” if the Civil War had not been lost. 
* * * Like many of the sites we toured across the South, Beauvoir is privately owned and operated. Its board of directors is made up of members of the Mississippi division of the Sons of Confederate Veterans, a national organization founded in 1896 and limited to male descendants of “any veteran who served honorably in the Confederate armed forces.” The board handles the money that flows into the institution from visitors, private supporters and taxpayers. The Mississippi legislature earmarks $100,000 a year for preservation of Beauvoir. In 2014, the organization received a $48,475 grant from the Federal Emergency Management Agency for “protective measures.” As of May 2010, Beauvoir had received $17.2 million in federal and state aid related to damages caused by Hurricane Katrina in 2005. While nearly half of that money went to renovating historic structures and replacing contents, more than $8.3 million funded construction of a new building that contains a museum and library. When we visited, three times since the fall of 2017, the lavishly appointed library displayed the only acknowledgment of slavery that we could find at the entire 52-acre site, though Davis had owned dozens of black men, women and children before the war: four posters, which portrayed the former slaves Robert Brown, who continued to work for the Davis family after the war, and Benjamin and Isaiah Montgomery, a father and son who were owned by Jefferson’s elder brother, Joseph. Benjamin eventually purchased two of Joseph’s plantations. The state Department of Archives and History says the money the legislature provides to Beauvoir is allocated for preservation of the building, a National Historic Landmark, not for interpretation. Beauvoir staff members told us that the facility doesn’t deal with slavery because the site’s state-mandated focus is the period Davis lived there, 1877 to 1889, after slavery was abolished. But this focus is honored only in the breach. The museum celebrates the Confederate soldier in a cavernous hall filled with battle flags, uniforms and weapons. Tour guides and re-enactors routinely denied the realities of slavery in their presentations to visitors. Fall Muster, a highlight of the Beauvoir calendar, is nothing if not a raucous salute to Confederate might. Thomas Payne, the site’s executive director until this past April, said in an interview that his goal was to make Beauvoir a “neutral educational institution.” For him, that involved countering what he referred to as “political correctness from the national media,” which holds that Southern whites are “an evil repugnant group of ignorant people who fought only to enslave other human beings.” Slavery, he said, “should be condemned. But what people need to know is that most of the people in the South were not slave owners,” and that Northerners also kept slaves. What’s more, Payne went on, “there’s actually evidence where the individual who was enslaved was better off physically and mentally and otherwise.” The notion that slavery was beneficial to slaves was notably expressed by Jefferson Davis himself, in the posthumously published memoir he wrote at Beauvoir. Enslaved Africans sent to America were “enlightened by the rays of Christianity,” he wrote, and “increased from a few unprofitable savages to millions of efficient Christian laborers. 
Their servile instincts rendered them contented with their lot....Never was there a happier dependence of labor and capital upon each other.” That myth, a pillar of the Lost Cause, remains a core belief of neo-Confederates, despite undeniable historic proof of slavery’s brutality. In 1850, the great abolitionist Frederick Douglass, who had escaped slavery, said, “To talk of kindness entering into a relation in which one party is robbed of wife, of children, of his hard earnings, of home, of friends, of society, of knowledge, and of all that makes this life desirable is most absurd, wicked, and preposterous.” * * * A few miles off the highway between Montgomery and Birmingham, past trailer homes and cotton fields, are the manicured grounds and arched metal gateways of Confederate Memorial Park. The state of Alabama acquired the property in 1903 as an old-age home for Confederate veterans, their wives and their widows. After the last residents died, the home closed. But in 1964, as civil rights legislation gained steam in Washington, Alabama’s all-white legislature revived the site as a “shrine to the honor of Alabama’s citizens of the Confederacy.” The day we visited, 16 men in Confederate uniforms drilled in the quiet courtyards. Two women in hoop skirts stood to the side, looking at their cellphones. Though Alabama state parks often face budget cuts—one park had to close all its campsites in 2016—Confederate Memorial Park received some $600,000 that year. In the past decade, the state has allocated more than $5.6 million to the site. The park, which in 2016 served fewer than 40,000 visitors, recently expanded, with replica Civil War barracks completed in 2017. The museum in the Alabama park attempts to tell the history of the Civil War through the story of the common Confederate soldier, an approach that originated soon after the war and remains popular today. It is tragic that hundreds of thousands of young men died on the battlefield. But the common soldier narrative was forged as a sentimental ploy to divert attention from the scalding realities of secession and slavery—to avoid acknowledging that “there was a right side and a wrong side in the late war,” as Douglass put it in 1878. The memorial barely mentions black people. On a small piece of card stock, a short entry says “Alabama slaves became an important part of the war’s story in several different ways,” adding that some ran away or joined the Union Army, while others were conscripted to fight for the Confederacy or maintain fortifications. There is a photograph of a Confederate officer, reclining, next to an enslaved black man, also clad in a uniform, who bears an expression that can only be described as dread. Near the end of the exhibit, a lone panel states that slavery was a factor in spurring secession. These faint nods to historical fact were overpowered by a banner that spanned the front of a log cabin on state property next to the museum: “Many have been taught the war between the states was fought by the Union to eliminate Slavery. THIS VIEW IS NOT SUPPORTED BY THE HISTORICAL EVIDENCE....The Southern States Seceded Because They Resented the Northern States Using Their Numerical Advantage in Congress to Confiscate the Wealth of the South to the Advantage of the Northern States.” The state has a formal agreement with the Sons of Confederate Veterans to use the cabin as a library. Inside, books about Confederate generals and Confederate history lined the shelves. 
The South Was Right!, which has been called the neo-Confederate “bible,” lay on a table. The 1991 book’s co-author, Walter Kennedy, helped found the League of the South, a self-identified “Southern nationalist” organization that the Southern Poverty Law Center has classified as a hate group. “When we Southerners begin to realize the moral veracity of our cause,” the book says, “we will see it not as a ‘lost cause,’ but as the right cause, a cause worthy of the great struggle yet to come!” A spokeswoman for the Alabama Historical Commission said she could not explain how the banner on the cabin had been permitted and declined our request to interview the site’s director. Alabama laws, like those in other former Confederate states, make numerous permanent allocations to advance the memory of the Confederacy. The First White House of the Confederacy, where Jefferson Davis and his family lived at the outbreak of the Civil War, is an Italianate mansion in Montgomery adjacent to the State Capitol. The state chartered the White House Association of Alabama to run the facility, and spent $152,821 in 2017 alone on salaries and maintenance for this monument to Davis—more than $1 million over the last decade—to remind the public “for all time of how pure and great were southern statesmen and southern valor.” That language from 1923 remains on the books. * * * An hour and a half east of Atlanta by car lies Crawfordville (pop. 600), the seat of Taliaferro County, a majority black county with one of the lowest median household incomes in Georgia. A quarter of the town’s land is occupied by the handsomely groomed, 1,177-acre A.H. Stephens State Park. Since 2011 state taxpayers have given the site $1.1 million. Most of that money is spent on campsites and trails, but as with other Confederate sites that boast recreational facilities—most famously, Stone Mountain, also in Georgia—the A.H. Stephens park was established to venerate Confederate leadership. And it still does. Alexander Hamilton Stephens is well known for a profoundly racist speech he gave in Savannah in 1861, a month after becoming vice president of the provisional Confederacy, in which he declared that the Confederacy’s “foundations are laid, its cornerstone rests upon the great truth, that the negro is not equal to the white man; that slavery—subordination to the superior race—is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.” That speech was nowhere in evidence during our visit to the park. It wasn’t in the Confederate museum, which was erected by the United Daughters of the Confederacy with the support of the state of Georgia in 1952 and displays Confederate firearms and uniforms. It wasn’t among the printed texts authored by Stephens that are placed on tabletops in the former slave quarters for visitors to peruse. And it wasn’t in the plantation house, called Liberty Hall. Our guide, a state employee, opened the door of a small two-room cabin once occupied by Harry and Eliza—two of the 34 people Stephens held in bondage. The guide pointed to a photograph of the couple on a wall and said Stephens “kept them good, and took care of the people who worked for him.” We went on many tours of the homes of the Confederacy’s staunchest ideologues, and without exception we were told that the owners were good and the slaves were happy. 
After the war, Stephens spent a great deal of energy pretending he wasn’t entirely pro-slavery, and he returned to public life as a member of Congress and then as governor. Robert Bonner, a historian at Dartmouth who is at work on a biography of Stephens, said the Stephens memorial maintains the fraud: “The story at Liberty Hall is a direct version of the story Stephens fabricated about himself after the war.” Half an hour away is the home of Robert Toombs, the Confederacy’s secretary of state and Stephens’ close friend. His house has recently been restored, with state as well as private funds, and Wilkes County has taken over daily operations. In a ground-floor gallery, posters in gilt frames hang below banners that announce the four acts of Toombs’ life: “The Formative Years,” “The Baron of Wilkes County,” “The Premier of the Confederacy” and “Without a Country.” About slavery, nothing. When asked about that, the docent, a young volunteer, retrieved a binder containing a Works Progress Administration oral history given by Alonza Fantroy Toombs. It begins, “I’se the proudest nigger in de worl’, caze I was a slave belonging to Marse Robert Toombs of Georgia; de grandest man dat ever lived, next to Jesus Christ.” A more revealing, well-documented story is that of Garland H. White, an enslaved man who escaped Toombs’ ownership just before the Civil War and fled to Ontario. After the war erupted, he heroically risked his freedom to join the United States Colored Troops. He served as an Army chaplain and traveled to recruit African-American soldiers. We found no mention at the Toombs memorial of White’s experience. In fact, we know of no monument to White in all of Georgia. An average of $18,000 in county monies each year since 2011, plus $80,000 in state renovation funds in 2017 alone, has been devoted to this memorial to Toombs, who refused to take the oath of allegiance to the United States after the war and fled to Cuba and France to avoid arrest. Upon his return to Georgia, Toombs labored to circumscribe the freedom of African-Americans. “Give us a convention,” Toombs said in 1876, “and I will fix it so that the people shall rule and the Negro shall never be heard from.” The following year he got that convention, which passed a poll tax and other measures to disenfranchise black men. * * * It’s difficult to imagine that all the Confederate monuments and historic sites dotting the landscape today would have been established if African-Americans had had a say in the matter. Historically, the installation of Confederate monuments went hand in hand with the disenfranchisement of black people. The record suggests that monument-building peaked during three pivotal periods: from the late 1880s into the 1890s, as Reconstruction was being crushed; from the 1900s through the 1920s, with the rise of the second Ku Klux Klan, the increase in lynching and the codification of Jim Crow; and in the 1950s and 1960s, around the centennial of the war but also in reaction to advances in civil rights. An observation by the Yale historian David Blight, describing a “Jim Crow reunion” at Gettysburg, captures the spirit of Confederate monument-building: “white supremacy might be said to have been the silent, invisible, master of ceremonies.” Yet courageous black leaders did speak out, right from the start. In 1870, Douglass wrote, “Monuments to the ‘lost cause’ will prove monuments of folly ... 
in the memories of a wicked rebellion which they must necessarily perpetuate...It is a needless record of stupidity and wrong.” In 1931, W.E.B. Du Bois criticized even simple statues erected to honor Confederate leaders. “The plain truth of the matter,” Du Bois wrote, “would be an inscription something like this: ‘sacred to the memory of those who fought to Perpetuate Human Slavery.’” In 1966, Martin Luther King Jr. joined a voting rights rally in Grenada, Mississippi, at the Jefferson Davis monument, where, earlier that day, an organizer named Robert Green declared, “We want brother Jefferson Davis to know the Mississippi he represented, the South he represented, will never stand again.” In today’s debates about the public display of Confederate symbols, the strong objections of early African-American critics are seldom remembered, perhaps because they had no impact on (white) officeholders at the time. But the urgent black protests of the past now have the ring of prophecy. John Mitchell Jr., an African-American, was a journalist and a member of Richmond’s city council in the years after Reconstruction. Like his friend and colleague Ida B. Wells, Mitchell was born into slavery, and spent much of his career documenting lynchings and campaigning against them; also like Wells, he was personally threatened with lynching. Arguing fiercely against spending public money to memorialize the Confederacy, Mitchell took aim at the movement to erect a grand Robert E. Lee statue, and tried to block funding for the proposed statue’s dedication ceremony. But a white conservative majority steamrolled Mitchell and the two other black council members, and the Lee statue was unveiled on May 29, 1890. Gov. Fitzhugh Lee, a nephew of Robert E. Lee and a former Confederate general himself, was president of the Lee Monument Association, which executed the project. Virginia issued bonds to support its construction. The city of Richmond funded Dedication Day events, attended by some 150,000 people. Mitchell covered the celebration for the Richmond Planet, the paper he edited. “This glorification of States Rights Doctrine—the right of secession, and the honoring of men who represented that cause,” he wrote, “fosters in the Republic, the spirit of Rebellion and will ultimately result in the handing down to generations unborn a legacy of treason and blood.” In the past decade, Virginia has spent $174,000 to maintain the Lee statue, which has become a lightning rod for the larger controversy. In 2017, Richmond police spent some $500,000 to guard the monument and keep the peace during a neo-Confederate protest there. * * * In 1902, several years after nearly every African-American elected official was driven from office in Virginia, and as blacks were being systematically purged from voter rolls, the state’s all-white legislature established an annual allocation for the care of Confederate graves. Over time, we found, that spending has totaled roughly $9 million in today’s dollars. Treating the graves of Confederate soldiers with dignity might not seem like a controversial endeavor. But the state has refused to extend the same dignity to the African-American men and women whom the Confederacy fought to keep enslaved. Black lawmakers have long pointed out this blatant inequity. In 2017, the legislature finally passed the Historical African American Cemeteries and Graves Act, which is meant to address the injustice. 
Still, less than $1,000 has been spent so far, and while a century of investment has kept Confederate cemeteries in rather pristine condition, many grave sites of the formerly enslaved and their descendants are overgrown and in ruins. Significantly, Virginia disburses public funding for Confederate graves directly to the United Daughters of the Confederacy, which distributes it to, among others, local chapters of the UDC and the Sons of Confederate Veterans. Since 2009, Virginia taxpayers have sent more than $800,000 to the UDC. The UDC, a women’s Confederate heritage group with thousands of members in 18 states and the District of Columbia, is arguably the leading advocate for Confederate memorials, and it has a history of racist propagandizing. One of the organization’s most influential figures was Mildred Lewis Rutherford, of Athens, Georgia, a well-known speaker and writer at the turn of the 20th century and the UDC’s historian general from 1911 to 1916. Rutherford was so devoted to restoring the racial hierarchies of the past that she traveled the country in full plantation regalia spreading what she called the “true history,” which cast slave owners and Klansmen as heroes. She pressured public schools and libraries across the South to accept materials that advanced Lost Cause mythology, including pro-Klan literature that referred to black people as “ignorant and brutal.” At the center of her crusade was the belief that slaves had been “the happiest set of people on the face of the globe,” “well-fed, well-clothed, and well-housed.” She excoriated the Freedmen’s Bureau, a federal agency charged with protecting the rights of African-Americans, and argued that emancipation had unleashed such violence by African-Americans that “the Ku Klux Klan was necessary to protect the white woman.” UDC officials did not respond to our interview requests. Previously, though, the organization has disavowed any links to hate groups, and in 2017 the president-general, Patricia Bryson, released a statement saying the UDC “totally denounces any individual or group that promotes racial divisiveness or white supremacy.” Confederate cemeteries in Virginia that receive taxpayer funds handled by the UDC are nonetheless used as gathering places for groups with extreme views. One afternoon last May, we attended the Confederate Memorial Day ceremony in the Confederate section of the vast Oakwood Cemetery in Richmond. We were greeted by members of the Sons of Confederate Veterans and the Virginia Flaggers, a group that says its mission is to “stand AGAINST those who would desecrate our Confederate Monuments and memorials, and FOR our Confederate Veterans.” An honor guard of re-enactors presented an array of Confederate standards. Participants stood at attention for an invocation read by a chaplain in period dress. They put their hands on their hearts, in salute to the Confederate flag. Susan Hathaway, a member of the Virginia Flaggers, led the crowd of several dozen in a song that was once the official paean to the Commonwealth:

Carry me back to old Virginny,
There’s where the cotton and the corn and taters grow,
There’s where the birds warble sweet in the springtime,
There’s where this old darkey’s heart am long’d to go. 
* * * “Very little has been done to address the legacy of slavery and its meaning in contemporary life.” That scathing assessment of the nation’s unwillingness to face the truth was issued recently by the Equal Justice Initiative, the Montgomery-based legal advocacy group that in April 2018 opened the first national memorial to victims of lynching. A few Confederate historical sites, though, are showing signs of change. In Richmond, the American Civil War Center and the Museum of the Confederacy have joined forces to become the American Civil War Museum, now led by an African-American CEO, Christy Coleman. The new entity, she said, seeks to tell the story of the Civil War from multiple perspectives—the Union and the Confederacy, free and enslaved African-Americans—and to take on the distortions and omissions of Confederate ideology. “For a very, very long time” the Lost Cause has dominated public histories of the Civil War, Coleman told us in an interview. “Once it was framed, it became the course for everything. It was the accepted narrative.” In a stark comparison, she noted that statues of Hitler and Goebbels aren’t scattered throughout Germany, and that while Nazi concentration camps have been made into museums, “they don’t pretend that they were less horrible than they actually were. And yet we do that to America’s concentration camps. We call them plantations, and we talk about how grand everything was, and we talk about the pretty dresses that women wore, and we talk about the wealth, and we refer to the enslaved population as servants as if this is some benign institution.” Stratford Hall, the Virginia plantation where Robert E. Lee was born, also has new leadership. Kelley Deetz, a historian and archaeologist who co-edited a paper titled “Historic Black Lives Matter: Archaeology as Activism in the 21st Century,” was hired in June as the site’s first director of programming and education. Stratford Hall, where 31 people were enslaved as of 1860, is revising how it presents slavery. The recent shocking violence in Charlottesville, Deetz said, was speeding up “the slow pace of dealing with these kinds of sensitive subjects.” She said, “I guarantee you that in a year or less, you go on a tour here and you’re going to hear about enslavement.” In 1999, Congress took the extraordinary step of advising the National Park Service to re-evaluate its Civil War sites and do a better job of explaining “the unique role that slavery played in the cause of the conflict.” But vestiges of the Lost Cause still haunt park property. In rural Northern Virginia, in the middle of a vast lawn, stands a small white clapboard house with a long white chimney—the Stonewall Jackson Shrine, part of the Fredericksburg & Spotsylvania National Military Park. The Confederate general died in the house in May 1863. “The tendency for the park historically has been to invite people to mourn Jackson’s death,” John Hennessy, the park’s chief historian, told us. He believes that the site should be more than a shrine, however. Visitors, Hennessy said, should learn that Jackson “led an army in a rebellion in the service of a nation that intended to keep people in bondage forever.” He went on, “The greatest enemy to good public history is omission. We are experiencing as a society now the collateral damage that forgetting can inflict.” A park ranger sitting in the gift shop rose to offer us a practiced talk that focused reverently on Jackson’s final days—the bed he slept on, the clock that still keeps time. 
The ranger said a “servant,” Jim Lewis, had stayed with Jackson in the small house as he lay dying. A plaque noted the room where Jackson’s white staff slept. But there was no sign in the room across the hall where Lewis stayed. Hennessy had recently removed it because it failed to acknowledge that Lewis was enslaved. Hennessy is working on a replacement. Slavery, for the moment, was present only in the silences. * * * During the Fall Muster at Beauvoir, the Jefferson Davis home, we met Stephanie Brazzle, a 39-year-old African-American Mississippian who had accompanied her daughter, a fourth grader, on a field trip. It was Brazzle’s first visit. “I always thought it was a place that wasn’t for us,” she said. Brazzle had considered keeping her daughter home, but decided against it. “I really do try to keep an open mind. I wanted to be able to talk to her about it.” Brazzle walked the Beauvoir grounds all morning. She stood behind her daughter’s school group as they listened to re-enactors describe life in the Confederacy. She waited for some mention of the enslaved, or of African-Americans after emancipation. “It was like we were not even there,” she said, as if slavery “never happened.” “I was shocked at what they were saying, and what wasn’t there,” she said. It’s not that Brazzle, who teaches psychology, can’t handle historic sites related to slavery. She can, and she wants her daughter, now 10, to face that history, too. She has taken her daughter to former plantations where the experience of enslaved people is a part of the interpretation. “She has to know what these places are,” Brazzle said. “My grandmother, whose grandparents were slaves, she told stories. We black people acknowledge that this is our history. We acknowledge that this still affects us.” The overarching question is whether American taxpayers should support Lost Cause mythology. For now, that invented history, told by Confederates and retold by sympathizers for generations, is etched into the experience at sites like Beauvoir. In the well-kept Confederate cemetery behind the library, beyond a winding brook, beneath the flagpole, a large gray headstone faces the road. It is engraved with lines that the English poet Philip Stanhope Worsley dedicated to Robert E. Lee: “No nation rose so white and fair, none fell so pure of crime.” This article is a selection from the December issue of Smithsonian magazine
297e0c2d705db7fa5e88be10bff9c0bb
https://www.smithsonianmag.com/history/covid-19-adds-new-snag-to-2020-census-count-native-americans-180975150/
COVID-19 Adds a New Snag to the 2020 Census Count of Native Americans
COVID-19 Adds a New Snag to the 2020 Census Count of Native Americans Earlier this year, before the COVID-19 crisis hit the United States, Native American officials representing New Mexico’s 23 tribes met in Albuquerque to discuss how they could avoid a repeat of 2010, when that year’s census sorely undercounted the nation’s indigenous population. Because of the inaccurate calculation of their population—the 2010 census undercounted Native Americans on reservations by 4.9 percent—tribal communities lost thousands of dollars in federal funding and risked losing representation for their states in the House of Representatives. Attendees at the Albuquerque retreat made plans to educate their tribes about the importance of the census using field organizers and social media and also discussed what had and hadn’t worked in the past for different nations. Ahtza Dawn Chavez, the executive director of the Native American Voters Alliance Education Project, felt optimistic after the event. “We really felt like we had a really great game plan,” she says. But then COVID-19 hit and began to spread rapidly. The U.S. Census Bureau suspended on-the-ground field operations, and many reservations in the West and Southwest—including the Navajo Nation, the largest in the country—closed their borders to outside visitors and tourists in March, hoping to contain the disease. The situation presented a new hurdle in the long-standing challenge of getting an accurate count of Native Americans in this country. The first U.S. census was held in 1790 to determine how many seats each state should get in the House of Representatives. Mark Hirsch, a historian at Smithsonian’s National Museum of the American Indian, says there were two glaring omissions: enslaved workers from the African diaspora, who were considered only three-fifths of a person, and Native Americans on reservations, who were considered “Indians not taxed” and, therefore, not counted at all. It would be more than 100 years before the census began to count Native Americans on reservations, where the majority of them lived at that time. While census counts today provide communities with federal funding and resources, for decades Native Americans were counted by the federal government for less-virtuous reasons. From the late 19th century to the early 20th century, Hirsch says, federal commissioners—not from the U.S. Census Bureau—canvassed reservations to determine the number of heads of households. They then allotted each household a plot of land, aiming to privatize what had been communal. The land that was left over was then sold to white settlers. This was all part of an effort to force Native Americans to give up their land and culturally assimilate, Hirsch says. “The whole point was to turn native people into white farmers like everybody else that had private property,” Hirsch says. “They would be part of American culture. They would adhere to all the late-19th century, white, middle-class values of hard work, independence and ownership of property, etc. But it was also a way of taking collectively owned tribal land away from native people and selling it to white settlers. Thousands and thousands and thousands of acres of land were taken away from native people that way.” The U.S. Census Bureau began including Native Americans on reservations in its own count in 1900, but the breaking up of tribal lands for allotment was not outlawed by Congress until 1934. 
Although the information collected today in the census is confidential and used only to determine representation and federal funding, the effects of centuries of mistreatment by the government linger. Many Native Americans remain understandably wary of handing over their information. One of the biggest hurdles, Chavez says, is distrust. “When you think about the historical distrust of a lot of these tribal communities of federal agents coming into their communities and just knocking on doors and saying, ‘Who are you?’ and ‘Who lives here?’ you can quickly see why some of these counts have been low disproportionately,” she says. The current national self-response rate for the 2020 census is more than 61 percent, and most people have responded on the internet using a census ID they received in the mail that links their responses to their address. Some tribal areas have already matched—or even exceeded—the national self-response rate for 2020, but in some nations the rate is below 10 percent. The self-response rate for the Navajo Nation reservation, where over 173,000 people live, is below 2 percent. Many homes on reservations and in the surrounding rural areas don’t have city-style addresses or reliable internet, so census enumerators typically go door-to-door, updating addresses and leaving paper questionnaires, which can be mailed in. This process, known as “update leave,” can run into challenges in getting accurate counts. Undercounting is at its worst in places that are difficult to map, whether because they are remote or because they do not use city-style addressing. Kewa Pueblo, where Chavez is from, for example, was undercounted by approximately 24 percent in the 1980 census due to the bureau’s faulty mapping method, which excluded parts of the reservation from the official census maps and left people uncounted. The undercount rate for Native Americans has gone down over the decades, but there’s still a long way to go. Tribes also keep their own enrollment records, but certain types of federal funding to build infrastructure and provide health care on reservations are allocated based only on the census count. In New Mexico, for every person who is not counted, the state loses $5,000 in federal funding per year—or $50,000 total per person until the next census in 2030. The census count on reservations determines how much federal money tribes get for services such as the Indian Housing Block Grant, which provides affordable housing, and the Indian Health Service, which serves 2.2 million Native Americans across the United States. Kevin Allis, CEO of the National Congress of American Indians, says he’s worried that the pandemic, in combination with historical census challenges, will lead to another undercount this year. In the Southwest, Native Americans are suffering disproportionate rates of COVID-19 infection. As of early June, the White Mountain Apache Tribe in Arizona had an infection rate of 6.6 percent, and the Navajo Nation, which spans parts of New Mexico, Arizona and Utah, had a rate of 2.852 percent. To compare, the New York City zip code with the highest infection rate, in East Elmhurst, Queens, stood at 4.358 percent. “The timing of this pandemic could not be worse for all these communities because representation and federal funding are all going to be tied to this dataset for the next decade,” Allis says. 
In April, Congress approved a 120-day extension to complete the census, with a new deadline in October, but that may not be enough time for Native American communities. Dee Alexander, the tribal affairs coordinator for the Office of Congressional and Intergovernmental Affairs at the U.S. Census Bureau, says her office began consulting with tribes in 2015 and appointing tribal liaisons, but it couldn’t have anticipated the closing of borders back then. “We’re just trying to take things day by day,” Alexander says. “We didn’t expect this to happen with our operations. That question was asked specifically for Navajo: What if the borders are still closed after October 31?” The U.S. Census Bureau started the “update leave” process just days before suspending field operations in mid-March because of COVID-19, so most rural and tribal areas had not received their census forms. Field operations resumed in New Mexico in May, but not on reservations, where tribal leaders will decide when it is safe to open their borders. “Once this resumes, we're going to pretty much be, for the most part, at ground zero,” Chavez says. One of the most important things the Census Bureau can do, Allis says, is to hire more Native Americans to be enumerators in their own communities. This year, however, the application process moved almost entirely online, making it harder for people on reservations without reliable internet to apply. “Non-natives trying to navigate through a reservation community is really difficult,” Allis says. “They're not familiar with the environment. They're not familiar with the traditions and customs, and so it's really important that the enumerators be from the community or from an American Indian or Alaska Native community and familiar with the nuances that exist on tribal reservations.” Iakowi:he’ne’ Oakes is the executive director of the American Indian Community House in New York City, which received funding from the city for census outreach efforts. The organization got a late start because it didn’t receive its funding until February. When applying, Oakes asked for about $120,000. She received only $50,000 out of a total of $19 million given to community-based organizations in the city. Oakes, who is part of the Rotinoshonni/Mohawk Nation, was disappointed, but not surprised. “This is the number one place where erasure occurs, the census,” she says. The Census Bureau did not find a significant undercount of Native Americans and Alaska Natives in urban communities in 2010. Across the United States, 70 percent of Native Americans live in metropolitan areas, with New York City as home to the largest population—over 111,000. While the census count is historically low on reservations, COVID-19 may also make it low among Native Americans living in urban communities. Oakes closed down her office in early March, days before the city and state of New York went on pause. Her first concern was the health of her community, many of whom are particularly vulnerable to COVID-19. “In New York City, there are a lot of homeless natives,” Oakes says. “A large percentage of our community is low income and homeless. So that's even more difficult for us to access them, to get them counted. A place that we had strategized to work with them was our events because usually the homeless natives show up to our events. But we can't have any events. That was our main work plan, and now it's social distancing.” Complicating her plans further, the funding the organization received runs only through the end of June. 
During the 120-day extension, the organization hopes to set up a mobile Wi-Fi pop-up center on Governors Island, where it usually hosts events and performances, so community members can fill out the census questionnaire online. They’ll continue promoting the census on social media as well. “People in our community need to understand that we would be doing a service to the federal government to not get the numbers — obviously a disservice to us, but a service to those racist regimes that would prefer keeping us under the rug somewhere, starving and struggling,” Oakes says. As cities and states begin to open up and census field operations resume, many reservations remain closed and Native American communities remain cautious. Their population has already been diminished, in part, by disease. In New Mexico, Chavez still has hope for achieving an accurate count, but the obstacles are mounting. “We’re doing everything we can to make sure everyone is counted, but we keep having more challenges thrown our way,” she says.
78e6722c000fd9ddc7c059775a10ee1d
https://www.smithsonianmag.com/history/crackerjack-lineup-baseball-memorabilia-drives-home-games-american-essence-180969657/
This Crackerjack Lineup of Baseball Memorabilia Drives Home the Game’s American Essence
This Crackerjack Lineup of Baseball Memorabilia Drives Home the Game’s American Essence Baseball’s fidelity to its past easily outdistances that of any other sport. Not only are today’s players still compared to Babe Ruth, Honus Wagner and Walter Johnson, stars of the early 20th century, but baseball’s structure and rules are largely the same as they were more than a century ago. That connection is made especially vivid through the rediscovered 1857 “Laws of Base Ball,” a 14-page document dubbed the sport’s Magna Carta, now making its first appearance in a major exhibition, at the Library of Congress. The revered artifact is on loan from Hayden Trubitt, a lifelong fan of the sport, who bought it at auction in April 2016 for $3.26 million, after taking out a $1 million mortgage on his home to do so. Baseball historians were aware that an 1857 convention of New York-area clubs, initiated by the Knickerbocker Base Ball Club, had standardized the rules of play. What they did not know for more than a century was that the document with its proposed and finalized rules had survived. It made its debut without fanfare in a 1999 Sotheby’s auction. The winning bidder unwittingly purchased the document as part of a large collection of maps. Authentication came 16 years later, in the lead-up to the Trubitt sale, when John Thorn, the official historian of Major League Baseball, labeled it the “Magna Carta of baseball.” “The provenance is impeccable,” Thorn says, “and it stands to reason that the Laws, as printed in the newspapers of the day, had to have been based on a set of handwritten proposals from the Knickerbocker delegation, which called the convention into being.” The document lays out the core of baseball—that the bases would be 90 feet apart; that a game would have nine innings; and that there would be nine players to a side. Former player Daniel ‘Doc’ Adams, elected as the presiding officer of the convention, authored the laws, which are exhibited along with two earlier drafts—the 1856 proposed Laws of Base Ball and the 1856-57 Rules for Match Games of Base Ball—which together formed the basis for the 1857 Laws. Other rules would be put in their modern form decades later—the pitching distance was set at its current 60 feet 6 inches in 1893—but it was with this document that baseball became the first organized sport in the United States. “These documents form a treasured part of Americana because baseball is our national game, to this very day,” says Thorn. Observing that the manuscript includes notes of the deliberations written into the margins in real time, or “history as it is being made,” Trubitt, who has no collecting background or aspirations, speaks passionately about his find. “It would be hard to define the United States culturally without sports,” he says. “And that’s entirely on the basis of organized sports. The way baseball became organized in 1857 was through an amazingly American and democratic method. It was a convention of, by and for the players, with all views being taken into account in the amendments and voting. It wasn’t like anybody commanded this all to happen, as in college football. It is really remarkable and touching. It is an American story.” David Mandel, chief of the interpretive programs office at the Library of Congress, says that the exhibition team chose to focus on the idea of baseball as community rather than emphasizing the sport’s chronology. “It’s a thematic narrative,” says Mandel. 
“It’s about the origins of the game and also the expanding inclusivity in terms of who’s playing, about the culture of the ballpark and commercial aspects of the sport, and also a bit about the art and science of the game.” In the section titled "Who’s Playing," an uncut sheet of baseball cards of members of the Washington Base Ball Club from 1887 complements an uncut sheet of baseball cards from 1994. “You see that some of the poses are comparable,” says Susan Reyburn, curator of the exhibit. “Players have moved from a studio in 1887, where they were posing for photos while standing on a floor with floral carpeting, a paper second base, and a ball hanging from a string to where you see pictures which were taken on the field. On the 1994 cards, you can see the incredible diversity—this is no longer the all-white Washington Base Ball Club. You’re seeing every manner of baseball player included in this other set.” A heartfelt 1950 handwritten letter to Branch Rickey from Jackie Robinson, the first African-American to play in the major leagues, thanks the executive who gave Robinson the opportunity and changed the game forever. “It has been the finest experience I have had being associated with you and I want to thank you very much for all you have meant not only to me and my family but to the entire country and particularly the members of our race,” wrote Robinson. A Rockford Peaches uniform belonging to pre-eminent base stealer Dottie Ferguson Key, who played in the All-American Girls Professional Baseball League from 1945 through 1954, is a highlight. The incredibly short skirt shows how little protection she had when sliding on dirt basepaths—she stole 461 bases in 950 games. A 1974 print by photographer Bettye Lane, titled Little League Tryouts for Females, New Jersey, is also a remarkable tribute to the young girls who finally became eligible to play in 1974. Among the various pieces of equipment on loan from the Baseball Hall of Fame are Babe Ruth’s shoes, which look more like something a coal miner would wear than any kind of athletic footwear. But what is just as striking is Babe Ruth’s auxiliary agreement from 1921, laying out how he could earn various monetary performance incentives all while his ability to change teams was restricted by baseball’s reserve clause. The same principle, which legally bound players to their respective teams, was embodied in an 1892 Western League contract, also on display. “This is what baseball players spend the next century fighting against,” says Reyburn. “One of the themes that runs through baseball is the players trying to fight for their liberty, here in the freest country in the world. And it’s right here in this very innocuous-looking document. The reserve clause is going to cause strikes and many battles between players and owners through the 1970s. There it is, in very wordy language, which basically says, ‘we own you.’” A 27-foot-high grandstand, which attendees can go around and through, was created by a design company to provide a physical manifestation of what it is like to come together in the stands. “The way we would define a community for the purposes of this exhibit, in the United States, when the weather is nice, on any given day, people are playing baseball or softball,” says Mandel. “From Omaha to Oakland, from Albany to Atlanta. Baseball is a part of the fabric of American life that way, with the quotidian nature of it.” Even while going back to baseball’s roots, the Library of Congress exhibition connects to the present. 
A children’s book from 1787 titled A Little Pretty Pocket Book, first printed in England in 1744, shows figures standing by posts, which function as bases, and includes the first mention of the sport in print along with a now famous verse: “Base-Ball/The Ball once struck off/Away flies the Boy/To the next destin’d Post/and then Home with Joy.” The pairing in the exhibit with H Is For Home Run, a 2009 children’s book, underscores that baseball books for children have been produced for more than two centuries. “Unlike other organized sports, baseball has been with us from the beginning of the United States, as an activity,” says Reyburn. “I think there is a feeling that even though football is something of the national game, baseball is the national pastime. Even now. More people are playing baseball and softball than any other sport. Baseball is sort of in our DNA, because from the 1780s, whether we realize it or not, the term ‘baseball’ has been here, and bat and ball games have been here. With the additions to baseball that Americans have made over the generations, I think there’s this feeling of ownership. We made this folk game our own.” “Baseball Americana” is on view at the Library of Congress in Washington, D.C. through June 2019. John N. McMurray will visit the Smithsonian October 1, 2018 for an evening program with Smithsonian Associates to examine how the World Series came to be, along with a fascinating replay of highlights from Series history. John N. McMurray's writing on baseball history has been published widely. He chairs both the Deadball Era Committee and the Oral History Committee of the Society for American Baseball Research (SABR).
b7521d8fd7093b01b8e35e2c627dd8ba
https://www.smithsonianmag.com/history/crowdsourcing-project-aims-document-many-us-places-where-women-have-made-history-180974535/
Crowdsourcing Project Aims to Document the Many U.S. Places Where Women Have Made History
Crowdsourcing Project Aims to Document the Many U.S. Places Where Women Have Made History For nearly 30 years, Alaskan Ahtna Athabascan elder Katie John awaited resolution to her peaceful battle over Native subsistence rights. The legal dispute—centering on her family’s right to fish in Batzulnetas, a historic village and fish camp in Wrangell-St. Elias National Park—made it all the way up to the U.S. Supreme Court. The Court's ruling cleared the way for the subsistence fishing rights of many Alaska Natives to be included under federal subsistence protection. Although John died in 2013 before litigation was complete, her 2014 win was a victory for Native Americans everywhere. Today, the fish camp remains a testament to John's life work, and it represents just one of the many sites where women made history, often with no official sign or record recognizing their importance. Since mid-January, the National Trust for Historic Preservation has been crowdsourcing places like the fish camp for its 1,000 Places Where Women Made History campaign, and the process to submit is simple. Anyone can log an online entry, which consists of a photo, as well as a short paragraph about the U.S.-based property and its location. “This is our way of bringing people together to tell us what are the places and stories that matter to them,” says Chris Morris, a National Trust senior field officer who's spearheading the campaign. Through submissions from local preservation societies, community organizations, and everyday people, they've already compiled more than 750 sites. Some, like the fish camp, may not have much recognition of their role in history, while others have been named National Historic Landmarks. “Although 2020's 100th anniversary of women's suffrage is the impetus for this work,” says Morris, “we also wanted to use the project to fully honor those many female leaders related to American history and culture.” According to Morris, the 1,000 Places project is part of a larger mission of the Trust’s to preserve women’s history. The Trust encourages local organizations to take direct action in preserving buildings and homes where women have “made a stand, raised their voice, and found the courage to change the world,” she says, and identifies historic sites that recognize women as part of its annual 11 Most Endangered Historic Places list, which in 2019 included the Excelsior Club in Charlotte, North Carolina—a once-thriving hub of the city's African American social scene—and Nashville's Music Row. The National Trust also operates 27 of its own historic sites, where it is working to bring to light the many amazing women associated with these places. The Farnsworth House in Plano, Illinois, for instance, was designed by famed modernist architect Ludwig Mies van der Rohe, but it was native Chicagoan and doctor Edith Farnsworth who commissioned it. “So this year Farnsworth House is shifting its perspective to tell the story of the house from her point of view,” says Morris. The ever-growing list of 1,000 Places Where Women Made History currently includes everything from homes where pioneering women once lived to buildings where specific events involving them occurred and sites where women-led accomplishments happened.
It includes spots like the former home of prominent investigative journalist Ida Tarbell in Titusville, Pennsylvania; the historic Auditorium Theatre in Chicago, saved through a fundraising campaign led by Beatrice Spachner; and Trumpet Records in Jackson, Mississippi, the former work site of a young record producer named Lillian McMurry, who recorded both black and white artists during the height of Mississippi segregation. “We want to reveal those sort of lesser-known and untold stories, because we recognize that women's history is America's history,” says Morris. “This crowdsourcing effort has been very successful in revealing such underappreciated tales, ones of women’s vision, courage and leadership countrywide. They make up the majority of our entries. They’re tales of thinkers, artists, scientists, entrepreneurs...those women who have really shaped the nation that we are today, and who continue to help us to move forward.” One of the Trust’s main goals with this project is to help a new generation of Americans, especially young women, see their own potential in the history of these places, says Morris. “We also will encourage everyone who submitted an entry to consider applying for funding from our many grant programs,” she says, “to support the broader interpretation and long-term preservation of these places where women made history.” Here are six lesser-known sites in the U.S. where women made history. Most of them are recognized in the 1,000 Places project, and all are on the Trust’s radar for renovation and reuse in some capacity. Though each is at a different stage of preservation and redevelopment, all are moving forward as testaments to women's achievements and as inspiration for new stories to come. Located on a residential block in Miami's upscale Coconut Grove neighborhood, this uninhabited wood-framed, T-shaped cottage has a special place in American history as the former home of Marjory Stoneman Douglas, a journalist, author and conservationist known as the “Grand Dame of the Everglades.” (She may sound familiar, too, as the namesake of the high school in Parkland, Florida, where 17 people were killed in a mass shooting in 2018.) Douglas published her seminal book, The Everglades: River of Grass, highlighting Florida's endlessly diverse subtropical wilderness and its need for ongoing preservation, in 1947. A month later, 20 percent of the Everglades’ southernmost portion became a national park. Douglas also founded the still-thriving Friends of the Everglades—an activist organization dedicated to protecting the landscape—in 1970, and often held meetings for conservationists at her Coconut Grove home, where she lived from 1926 until 1998. The Land Trust of Dade County currently oversees the property, which became a National Historic Landmark in 2015, and is working with other local and national preservation organizations on a reuse plan that continues Douglas's legacy as an environmentalist, while also being respectful of the community that surrounds it. One possibility, says Morris, is to use the property as a residency where scientists can come to continue their research on environmental issues and climate change. Pauli Murray was a civil rights and women's rights activist; an author, lawyer and member of the LGBTQ community; and the first African American woman to be ordained as an Episcopal priest.
She spent her formative years in this one-and-a-half-story home, built by her grandfather, alongside her grandparents and aunts—all of whom helped raise Murray. In 1944, this descendant of both enslaved laborers and slaveholders graduated first in her class at Howard University. Murray later received a Master of Laws degree from U.C. Berkeley in 1945, and in 1947 was named one of 10 “Young Women of the Year” by Mademoiselle magazine. She was also a founding member of the National Organization for Women (NOW) Foundation, which tackles a wide range of women's rights issues, from economic justice to reproductive rights. Murray's Durham childhood home has been a National Historic Landmark since 2016, and is both an entry on the National Trust's crowdsourcing campaign as well as one of its National Treasures. The Duke Human Rights Center at the Franklin Humanities Institute runs the Pauli Murray Project, which oversees the property, renovated it and is preparing to open it to the public as the Pauli Murray Center for History and Social Justice later this year. In 1915, Japanese immigrants Jukichi and Ken Harada wanted to purchase a home in Riverside, but the California Alien Land Law of 1913 prevented them from doing so. Instead, the couple acquired their modest Lemon Street property by putting it in the name of their three young children—a move that soon became a focal point for the groundbreaking legal case California v. Harada. Under the 14th Amendment, the Haradas won the right to keep their 1884 home, though their lives would never be the same. In 1942, the entire family was relocated to Japanese internment camps, where both Jukichi and Ken died. However, their youngest daughter Sumi returned to the Riverside home in the wake of World War II, taking in as boarders other Japanese families who'd lost their properties. Sumi resided at what's now known as Harada House until 1998, during which time she preserved many of the home's furnishings and fixtures, and kept a wealth of family heirlooms, including kimonos featuring the Harada family crest, personal letters and kitchenware. She also saved a message that her brother scribbled on a bedroom wall on the day his family was forced into a relocation center. Today the Riverside Metropolitan Museum oversees the home, which Jukichi had transformed from a single-story saltbox into a multi-story space, and is working to both restore it and turn it into an interpretive center highlighting the Harada story—one of lost civil rights, the fight against racial discrimination and the immigrant experience. The property has been a National Historic Landmark since 1990. On the famous San Francisco corner of Haight and Ashbury streets—the heart of the 1960s counterculture movement—stands the Doolan-Larson building, a mixed-use, multi-story property built in the early 20th century. This Colonial Revival-style structure, which survived the city's 1906 earthquake before being elevated to add storefronts, became home to San Francisco's first-ever hippie boutique. Twenty-four-year-old Peggy Caserta opened this mod clothing store, called Mnasidika (its name a shout-out to The Songs of Bilitis, a French book of lesbian poetry from the late 19th century), in 1965 and ran it until 1968, during which time it was a pivotal part of the Haight-Ashbury's counterculture scene. Caserta herself was bisexual—she was Janis Joplin's lover until Joplin's death in 1970—and according to Levi Strauss & Co., it was at Mnasidika that Jimi Hendrix developed his iconic Flower Child style.
Caserta is also credited with convincing Levi Strauss to create bell-bottom jeans, which she then sold at Mnasidika and which became a seminal part of '60s fashion. Upon his death in 2018, the property's owner, Norman Larson, bequeathed the Doolan-Larson building to San Francisco Heritage. Mnasidika’s original storefront—now a jewelry store and barber shop—remains largely as it was during the Summer of Love.* Though not yet on the list of places “Where Women Made History,” it is a part of the Trust's National Treasures. San Francisco Heritage and other preservation groups are currently looking at ways to reuse the structure in telling the stories of San Francisco's counterculture movement, including those of women like Caserta, as well as to highlight both its overall impact and continued relevance today. Another addition to the National Trust's 100 National Treasures list, Villa Lewaro was the summer home of Madam C.J. Walker (born Sarah Breedlove), an early 20th-century entrepreneur who made a fortune developing hair products for African American women. Walker, who is considered the first African American female millionaire in the U.S., is the subject of the new Netflix TV series, “Self Made,” starring Octavia Spencer as Walker. Along with being a businesswoman, Walker was a philanthropist and political and social activist. She occupied the 34-room, Italianate-style Villa Lewaro from 1918 to 1919, and though it's not currently open to the public, visitors can take a virtual tour of the estate led by Walker's great-great-granddaughter, A'Lelia Bundles. The New Voices Foundation—created to empower women entrepreneurs of color—acquired the property in 2018 and is working toward turning it into a “think tank,” according to New Voices founder Richelieu Dennis, “to foster entrepreneurship for present and future generations.” *Editor's Note, March 30, 2020: A previous version of this article incorrectly stated that the hippie boutique Mnasidika in San Francisco was in a storefront now occupied by a t-shirt shop, when, in fact, it was in a storefront now occupied by a jewelry store and barber shop. The story has been edited to correct that fact. Laura Kiniry is a San Francisco-based freelance writer specializing in food, drink, and travel. She contributes to a variety of outlets including American Way, O-The Oprah Magazine, BBC.com, and numerous AAA publications.
849326a4b2d4cabc5848e22ed45be4d9
https://www.smithsonianmag.com/history/cuyahoga-river-caught-fire-least-dozen-times-no-one-cared-until-1969-180972444/
The Cuyahoga River Caught Fire at Least a Dozen Times, but No One Cared Until 1969
The Cuyahoga River Caught Fire at Least a Dozen Times, but No One Cared Until 1969 It was the summer of 1969, and recent high school graduate Tim Donovan needed a job to pay his college tuition. When it came to well-paid summer work in Cleveland, there was one good place to look: the steel mills. Donovan went to work as a hatch tender for Jones & Laughlin Steel, standing at the top of machines stationed along the river to help unload ore carriers. It was his first real interaction with the Cuyahoga River, and the experience didn’t endear him to it. “The river was a scary little thing,” Donovan says. “There was a general rule that if you fell in, God forbid, you would go immediately to the hospital.” The water was nearly always covered in oil slicks, and it bubbled like a deadly stew. Sometimes rats floated by, their corpses so bloated they were practically the size of dogs. It was disturbing, but it was also just one of the realities of the city. For more than a century, the Cuyahoga River had been prime real estate for various manufacturing companies. Everyone knew it was polluted, but pollution meant industry was thriving, the economy was booming, and everyone had jobs. To the surprise of no one who worked on the Cuyahoga, an oil slick on the river caught fire the morning of Sunday, June 22, 1969. The blaze lasted only about 30 minutes, extinguished by land-based battalions and one of the city’s fireboats. It caused about $50,000 in damage to railroad bridges spanning the river and earned a small amount of attention in the local press. The fire was so small and short-lived that no one managed to get a single photo of it. For Donovan, the summer ended uneventfully and he went off to school without having thought much further on the state of Lake Erie or the Cuyahoga River. What happened next was the real surprise. Time magazine published an article on the fire—with an accompanying photo from an incident in 1952. National Geographic featured the river in its December 1970 cover story “Our Ecological Crisis” (but managed to get the date of the fire wrong). The Environmental Protection Agency was established in December 1970, for the first time creating a federal agency to oversee pollution regulations. In April 1970, Donovan was one of 1,000 students marching down to the river for the country’s first Earth Day. The nation, it seemed, had suddenly woken up to the realities of industrial pollution, and the Cuyahoga River was the symbol of calamity. But on the day of the fire, it had meant nothing to the masses. Only in the following months and years did the fire gain its strange significance. As historians David and Richard Stradling write, “The fire took on mythic status, and errors of fact became unimportant to the story’s obvious meaning. … Clearly this transformative fire must have been massive; the nation must have seen the flames and been appropriately moved. Neither is true.” ********** The Civil War turned Cleveland into a manufacturing city almost overnight. The Cuyahoga River, just south of the city’s downtown area, snaking for 100 miles across Ohio and emptying into Lake Erie, proved the perfect place for factories to set up camp. American Ship Building, Sherwin-Williams Paint Company, Republic Steel and Standard Oil all rose up from Cleveland, and the river bore the toxic legacy of their success. By the 1870s, the river had served as an open sewer and dump site for long enough that it was already threatening the city’s water supply.
In 1922, engineers at Cleveland’s Water Department conducted tests of the city’s drinking water in response to claims that the water tasted medicinal or like carbolic acid. Their findings: “The polluted water of the Cuyahoga River reached the water works intakes, and this polluted water contained the material which caused the obnoxious taste.” Everyone knew the river was polluted, but nobody much cared. If anything, it was a badge of honor. As David Newton writes in Chemistry of the Environment, “Fundamentally this level of environmental degradation was accepted as a sign of success.” In 1868, 1883, 1887, 1912, 1922, 1936, 1941, 1948 and 1952 the river caught fire, writes Laura La Bella in Not Enough to Drink: Pollution, Drought, and Tainted Water Supplies. Those are just the incidents we’re aware of; it’s hard to say how many other times oil slicks may have ignited, as press coverage and fire department records were both inconsistent. But not all the fires were as innocuous as that of 1969. Some caused millions of dollars’ worth of damage and killed people. But even with the obvious toll on the landscape, regulation of industry was limited at best. It seemed more important to keep the economy booming, the city growing and people working. This attitude was reflected in cities around the country. The Cuyahoga was far from the only river to catch fire during the period. Baltimore, Philadelphia, San Francisco, Buffalo and Galveston all used different methods to disperse oil on their waters in order to prevent fires. But the tide began to turn in the 1950s, according to the Stradlings. Between 1952 and 1969, Cleveland lost about 60,000 manufacturing jobs. Deindustrialization took hold alongside the Civil Rights movement and protests against the Vietnam War. “Over the years, Clevelanders were hardly complacent about the burning river, but not until the 1970s did they begin to think of its meaning in anything other than economic terms,” the Stradlings write. “That the Cuyahoga fire evolved into one of the great disasters of the environmental crisis tells us something about Americans’ growing suspicion of industrial landscapes, a suspicion encouraged by the decreasing benefits they derived from such places.” By 1968, the city was actively trying to clean up the river. That year, voters approved a $100 million bond program to fund cleanup, and the city attempted to improve its sewage system so as not to pollute the lake. After the 1969 fire, Cleveland’s mayor Carl Stokes, the first African-American elected to the position in any major American city, worked with his brother, Louis, in Congress to push for environmental regulation. Though the '69 fire was relatively small, the two brothers helped shape public perception of it as a turning point. “The story goes that it was the 1969 river fire that directly led to the establishment of the Environmental Protection Agency, but I think it was a little bit more complicated than that,” says Rebekkah Rubin, a public historian who collected oral histories for the 50th anniversary of the fire. “But for people who weren’t paying much attention to environmental advocacy, it’s easy to get behind the cleanup of a river that’s on fire.” Over the years, the river transformed from a dump site to a place for recreation. Today, Rubin sees people on the river kayaking, fishing and cruising on stand-up paddle boards, though she admits that such recreation is still not available to everyone in the city.
“The river doesn’t flow through the neighborhoods that tend to be more low income and more segregated, but I think it should be a resource available for all Clevelanders.” Despite its new life, the river still shows signs of its former degradation. In 2018, the Cleveland Plain Dealer reported that EPA scientists tested dozens of sites along the river bottom and found that polychlorinated biphenyl (PCB) levels remain dangerously high. Other scientists have cautioned that the river is still “burning” with viruses, bacteria and parasites, including Salmonella, Clostridium, enteroviruses, Giardia and hepatitis A. But even with these remaining issues, the Cuyahoga is unrecognizable compared to what it was a mere 50 years ago—as is the case with numerous waterways around America. Donovan, who today works as the director of the nonprofit Canalway Partners, has spent years working to build a path along the Cuyahoga River Canal that will make it more accessible to all Cleveland residents. He sees the river differently now, as if it and the city have undergone an identity crisis and are now settling into their new roles. “As the river gets cleaned up, entertainment options become more viable,” Donovan says. “Nobody is going to be sitting on a river with bloated rats floating by. It reflects the changing perception of what’s important here.” And for Donovan and Rubin alike, it’s a change worth celebrating, even if there’s still work to be done. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
2eeebba5da2028c9662e72237b4dbc22
https://www.smithsonianmag.com/history/dahomeys-women-warriors-88286072/?no-ist=
Dahomey’s Women Warriors
Dahomey’s Women Warriors It is noon on a humid Saturday in the fall of 1861, and a missionary by the name of Francesco Borghero has been summoned to a parade ground in Abomey, the capital of the small West African state of Dahomey. He is seated on one side of a huge, open square right in the center of the town–Dahomey is renowned as a “Black Sparta,” a fiercely militaristic society bent on conquest, whose soldiers strike fear into their enemies all along what is still known as the Slave Coast. The maneuvers begin in the face of a looming downpour, but King Glele is eager to show off the finest unit in his army to his European guest. As Father Borghero fans himself, 3,000 heavily armed soldiers march into the square and begin a mock assault on a series of defenses designed to represent an enemy capital. The Dahomean troops are a fearsome sight, barefoot and bristling with clubs and knives. A few, known as Reapers, are armed with gleaming three-foot-long straight razors, each wielded two-handed and capable, the priest is told, of slicing a man clean in two. The soldiers advance in silence, reconnoitering. Their first obstacle is a wall—huge piles of acacia branches bristling with needle-sharp thorns, forming a barricade that stretches nearly 440 yards. The troops rush it furiously, ignoring the wounds that the two-inch-long thorns inflict. After scrambling to the top, they mime hand-to-hand combat with imaginary defenders, fall back, scale the thorn wall a second time, then storm a group of huts and drag a group of cringing “prisoners” to where Glele stands, assessing their performance. The bravest are presented with belts made from acacia thorns. Proud to show themselves impervious to pain, the warriors strap their trophies around their waists. The general who led the assault appears and gives a lengthy speech, comparing the valor of Dahomey’s warrior elite to that of European troops and suggesting that such equally brave peoples should never be enemies. Borghero listens, but his mind is wandering. He finds the general captivating: “slender but shapely, proud of bearing, but without affectation.” Not too tall, perhaps, nor excessively muscular. But then, of course, the general is a woman, as are all 3,000 of her troops. Father Borghero has been watching the King of Dahomey’s famed corps of “amazons,” as contemporary writers termed them—the only female soldiers in the world who then routinely served as combat troops. When, or indeed why, Dahomey recruited its first female soldiers is not certain. Stanley Alpern, author of the only full-length English-language study of them, suggests it may have been in the 17th century, not long after the kingdom was founded by Dako, a leader of the Fon tribe, around 1625. One theory traces their origins to teams of female hunters known as gbeto, and certainly Dahomey was noted for its women hunters; a French naval surgeon named Repin reported in the 1850s that a group of 20 gbeto had attacked a herd of 40 elephants, killing three at the cost of several hunters gored and trampled. A Dahomean tradition relates that when King Gezo (1818-58) praised their courage, the gbeto cockily replied that “a nice manhunt would suit them even better,” so he drafted them into his army. But Alpern cautions that there is no proof that such an incident occurred, and he prefers an alternate theory that suggests the women warriors came into existence as a palace guard in the 1720s.
Women had the advantage of being permitted in the palace precincts after dark (Dahomean men were not), and a bodyguard may have been formed, Alpern says, from among the king’s “third class” wives–those considered insufficiently beautiful to share his bed and who had not borne children. Contrary to 19th-century gossip that portrayed the female soldiers as sexually voracious, they were formally married to the king—and since he never actually had relations with any of them, marriage rendered them celibate. At least one bit of evidence hints that Alpern is right to date the formation of the female corps to the early 18th century: a French slaver named Jean-Pierre Thibault, who called at the Dahomean port of Ouidah in 1725, described seeing groups of third-rank wives armed with long poles and acting as police. And when, four years later, Dahomey’s women warriors made their first appearance in written history, they were helping to recapture the same port after it fell to a surprise attack by the Yoruba–a much more numerous tribe from the east who would henceforth be the Dahomeans’ chief enemies. Dahomey’s female troops were not the only martial women of their time. There were at least a few contemporary examples of successful warrior queens, the best-known of whom was probably Nzinga of Matamba, one of the most important figures in 17th-century Angola—a ruler who fought the Portuguese, quaffed the blood of sacrificial victims, and kept a harem of 60 male concubines, whom she dressed in women’s clothes. Nor were female guards unknown; in the mid-19th century, King Mongkut of Siam (the same monarch memorably portrayed in quite a different light by Yul Brynner in The King and I) employed a bodyguard of 400 women. But Mongkut’s guards performed a ceremonial function, and the king could never bear to send them off to war. What made Dahomey’s women warriors unique was that they fought, and frequently died, for king and country. Even the most conservative estimates suggest that, in the course of just four major campaigns in the latter half of the 19th century, they lost at least 6,000 dead, and perhaps as many as 15,000. In their very last battles, against French troops equipped with vastly superior weaponry, about 1,500 women took the field, and only about 50 remained fit for active duty by the end. None of this, of course, explains why this female corps arose only in Dahomey. Historian Robin Law, of the University of Stirling, who has made a study of the subject, dismisses the idea that the Fon viewed men and women as equals in any meaningful sense; women fully trained as warriors, he points out, were thought to “become” men, usually at the moment they disemboweled their first enemy. Perhaps the most persuasive possibility is that the Fon were so badly outnumbered by the enemies who encircled them that Dahomey’s kings were forced to conscript women. The Yoruba alone were about ten times as numerous as the Fon. Backing for this hypothesis can be found in the writings of Commodore Arthur Eardley Wilmot, a British naval officer who called at Dahomey in 1862 and observed that women heavily outnumbered men in its towns—a phenomenon that he attributed to a combination of military losses and the effects of the slave trade. Around the same time, Western visitors to Abomey noticed a sharp jump in the number of female soldiers. Records suggest that there were about 600 women in the Dahomean army from the 1760s until the 1840s—at which point King Gezo expanded the corps to as many as 6,000.
No Dahomean records survive to explain Gezo’s expansion, but it was probably connected to a defeat he suffered at the hands of the Yoruba in 1844. Oral traditions suggest that, angered by Dahomean raids on their villages, an army from a tribal grouping known as the Egba mounted a surprise attack that came close to capturing Gezo and did seize much of his royal regalia, including the king’s valuable umbrella and his sacred stool. “It has been said that only two amazon ‘companies’ existed before Gezo and that he created six new ones,” Alpern notes. “If so, it probably happened at this time.” Recruiting women into the Dahomean army was not especially difficult, despite the requirement to climb thorn hedges and risk life and limb in battle. Most West African women lived lives of forced drudgery. Gezo’s female troops lived in his compound and were kept well supplied with tobacco, alcohol and slaves–as many as 50 to each warrior, according to the noted traveler Sir Richard Burton, who visited Dahomey in the 1860s. And “when amazons walked out of the palace,” notes Alpern, “they were preceded by a slave girl carrying a bell. The sound told every male to get out of their path, retire a certain distance, and look the other way.” To even touch these women meant death. While Gezo plotted his revenge against the Egba, his new female recruits were put through extensive training. The scaling of vicious thorn hedges was intended to foster the stoical acceptance of pain, and the women also wrestled one another and undertook survival training, being sent into the forest for up to nine days with minimal rations. The aspect of Dahomean military custom that attracted most attention from European visitors, however, was “insensitivity training”—exposing unblooded troops to death. At one annual ceremony, new recruits of both sexes were required to mount a platform 16 feet high, pick up baskets containing bound and gagged prisoners of war, and hurl them over the parapet to a baying mob below. There are also accounts of female soldiers being ordered to carry out executions. Jean Bayol, a French naval officer who visited Abomey in December 1889, watched as a teenage recruit, a girl named Nanisca “who had not yet killed anyone,” was tested. Brought before a young prisoner who sat bound in a basket, she: walked jauntily up to [him], swung her sword three times with both hands, then calmly cut the last flesh that attached the head to the trunk… She then squeezed the blood off her weapon and swallowed it. It was this fierceness that most unnerved Western observers, and indeed Dahomey’s African enemies. Not everyone agreed on the quality of the Dahomeans’ military preparedness—European observers were disdainful of the way in which the women handled their ancient flintlock muskets, most firing from the hip rather than aiming from the shoulder, but even the French agreed that they “excelled at hand-to-hand combat” and “handled [their weapons] admirably.” For the most part, too, the enlarged female corps enjoyed considerable success in Gezo’s endless wars, specializing in pre-dawn attacks on unsuspecting enemy villages. It was only when they were thrown against the Egba capital, Abeokuta, that they tasted defeat. Two furious assaults on the town, in 1851 and 1864, failed dismally, partially because of Dahomean overconfidence, but mostly because Abeokuta was a formidable target—a huge town ringed with mud-brick walls and harboring a population of 50,000. By the late 1870s Dahomey had begun to temper its military ambitions.
Most foreign observers suggest that the women’s corps was reduced to 1,500 soldiers at about this time, but attacks on the Yoruba continued. And the corps still existed 20 years later, when the kingdom at last found itself caught up in the “scramble for Africa,” which saw various European powers competing to absorb slices of the continent into their empires. Dahomey fell within the French sphere of influence, and there was already a small French colony at Porto-Novo when, in about 1889, female troops were involved in an incident that resulted in a full-scale war. According to local oral histories, the spark came when the Dahomeans attacked a village under French suzerainty whose chief tried to avert panic by assuring the inhabitants that the tricolor would protect them. “So you like this flag?” the Dahomean general asked when the settlement had been overrun. “Eh bien, it will serve you.” At the general’s signal, one of the women warriors beheaded the chief with one blow of her cutlass and carried his head back to her new king, Béhanzin, wrapped in the French standard. The First Franco-Dahomean War, which ensued in 1890, resulted in two major battles, one of which took place in heavy rain at dawn outside Cotonou, on the Bight of Benin. Béhanzin’s army, which included female units, assaulted a French stockade but was driven back in hand-to-hand fighting. No quarter was given on either side, and Jean Bayol saw his chief gunner decapitated by a fighter he recognized as Nanisca, the young woman he had met three months earlier in Abomey as she executed a prisoner. Only the sheer firepower of their modern rifles won the day for the French, and in the battle’s aftermath Bayol found Nanisca lying dead. “The cleaver, with its curved blade, engraved with fetish symbols, was attached to her left wrist by a small cord,” he wrote, “and her right hand was clenched around the barrel of her carbine covered with cowries.” In the uneasy peace that followed, Béhanzin did his best to equip his army with more modern weapons, but the Dahomeans were still no match for the large French force that was assembled to complete the conquest two years later. That seven-week war was fought even more fiercely than the first. There were 23 separate battles, and once again female troops were in the vanguard of Béhanzin’s forces. The women were the last to surrender, and even then—at least according to a rumor common in the French army of occupation—the survivors took their revenge on the French by covertly substituting themselves for Dahomean women who were taken into the enemy stockade. Each allowed herself to be seduced by a French officer, waited for him to fall asleep, and then cut his throat with his own bayonet. Their last enemies were full of praise for their courage. A French Foreign Legionnaire named Bern lauded them as “warrioresses… [who] fight with extreme valor, always ahead of the other troops. They are outstandingly brave … well trained for combat and very disciplined.” A French Marine, Henri Morienval, thought them “remarkable for their courage and their ferocity… [they] flung themselves on our bayonets with prodigious bravery.” Most sources suggest that the last of Dahomey’s women warriors died in the 1940s, but Stanley Alpern disputes this. Pointing out that “a woman who had fought the French in her teens would have been no older than 69 in 1943,” he suggests, more pleasingly, that it is likely one or more survived long enough to see her country regain its independence in 1960.
As late as 1978, a Beninese historian encountered an extremely old woman in the village of Kinta who convincingly claimed to have fought against the French in 1892. Her name was Nawi, and she died, aged well over 100, in November 1979. Probably she was the last. What were they like, these scattered survivors of a storied regiment? Some proud but impoverished, it seems; others married; a few tough and argumentative, well capable, Alpern says, of “beating up men who dared to affront them.” And at least one of them was still traumatized by her service, a reminder that some military experiences are universal. A Dahomean who grew up in Cotonou in the 1930s recalled that he regularly tormented an elderly woman he and his friends saw shuffling along the road, bent double by tiredness and age. He confided to the French writer Hélène Almeida-Topor that one day, one of us throws a stone that hits another stone. The noise resounds, a spark flies. We suddenly see the old woman straighten up. Her face is transfigured. She begins to march proudly… Reaching a wall, she lies down on her belly and crawls on her elbows to get round it. She thinks she is holding a rifle because abruptly she shoulders and fires, then reloads her imaginary arm and fires again, imitating the sound of a salvo. Then she leaps, pounces on an imaginary enemy, rolls on the ground in furious hand-to-hand combat, flattens the foe. With one hand she seems to pin him to the ground, and with the other stabs him repeatedly. Her cries betray her effort. She makes the gesture of cutting to the quick and stands up brandishing her trophy…. She intones a song of victory and dances: The blood flows, You are dead. The blood flows, We have won. The blood flows, it flows, it flows. The blood flows, The enemy is no more. But suddenly she stops, dazed. Her body bends, hunches. How old she seems, older than before! She walks away with a hesitant step. She is a former warrior, an adult explains…. The battles ended years ago, but she continues the war in her head. Sources Hélène Almeida-Topor. Les Amazones: Une Armée de Femmes dans l’Afrique Précoloniale. Paris: Editions Rochevignes, 1984; Stanley Alpern. Amazons of Black Sparta: The Women Warriors of Dahomey. London: C. Hurst & Co., 2011; Richard Burton. A Mission to Gelele, King of Dahome. London: RKP, 1966; Robin Law. ‘The ‘Amazons’ of Dahomey.’ Paideuma 39 (1993); J.A. Skertchley. Dahomey As It Is: Being a Narrative of Eight Months’ Residence in that Country, with a Full Account of the Notorious Annual Customs… London: Chapman & Hall, 1874. Mike Dash is a contributing writer in history for Smithsonian.com. Before Smithsonian.com, Dash authored the award-winning blog A Blast From the Past.
293e2c4b692697d1fe5e4fa0706a1757
https://www.smithsonianmag.com/history/dancing-for-mao-123906664/
Dancing for Mao
Dancing for Mao Li Zhensheng heard singing followed by a burst of applause. Following the sounds led the photojournalist to a young girl with unusually fair hair tied in ponytails, dancing with her arms upraised and surrounded by smiling, clapping soldiers. They were at the Red Guard Stadium in Harbin, in northern China, along with hundreds of thousands of Communist Party cadres, workers, peasants and other soldiers who had gathered for a marathon conference on the teachings of Chairman Mao Zedong. This was 1968, nearly two years into the Cultural Revolution, Mao's attempt to purge Chinese society of supposed bourgeois elements and escalate his own cult of personality. The conferees seemed to be trying to outdo one another in their professions of love for their nation's leader. On April 28, the last day of the 23-day gathering, a 5-year-old kindergartner was performing the "loyalty dance," as it was known. In front of the soldiers in the stadium stands, she skipped in place and sang: No matter how close our parents are to us, they are not as close as our relationship with Mao How absurd, thought Li, who was then a photographer for the Heilongjiang Daily, a party newspaper. The girl certainly was lovely and eager to please, but the photojournalist found the excess of zeal discomforting. "They had to love him to the extreme," says Li, now 68 and retired. In the cult of Mao, everyone was expected to perform the loyalty dance—from miners to office workers to toddlers to old ladies whose feet had been bound. "The movements were always toward the sky—that way you could show how respectful you were to Mao," Li says. "Everyone knew how to dance it." Li shot six photographs of the scene, of which the Heilongjiang Daily published two. When the girl—instantly known as "Little Yellow Hair"—returned home to Dedu County (now Wudalianchi City), people came to the roadside to cheer her for bringing fame and honor to their town. Li kept on taking pictures—including those he called his "negative negatives": Red Guards shaving the head of a provincial governor because his hairline was too similar to Mao's; security forces shooting, point-blank, two accused counterrevolutionaries for publishing a flier the government deemed too pro-Soviet. These were scenes that China did not want the rest of the world—or, for that matter, its own people—to see. In the darkroom, Li would separate potentially dangerous negatives and hide them in his desk. When the time seemed right he would take them home for safer keeping, having cut a hiding space the size of a book in the floorboards of his one-room apartment. Even after the Cultural Revolution effectively ended with Mao's death, at age 82, in 1976, Li was wary about showing his more incendiary work. In 1980 he left the newspaper to teach at Beijing University's International Political Science Institute. In 1988, the organizers of a nationwide photography competition—what Li says was China's first such undertaking as it opened up to the outside world—encouraged him to enter some of his pictures. Then-Defense Minister Zhang Aiping, who had been imprisoned for years during the Cultural Revolution, greeted the exhibition with the remark, "Let history tell the future." Li's pictures (which did not include "Little Yellow Hair") won the grand prize.
"The authorities were shocked by the violence depicted in Li's images of public humiliations inflicted upon dignitaries and by the photographs of the executions," says Robert Pledge, co-founder of New York City photo agency Contact Press Images, which would collaborate with Li in publishing his life's work in the book Red-Color News Soldier. (Images from the book have been shown in ten countries, with exhibits scheduled for Hungary, Australia and Singapore later this year.) For his part, Li says he remained haunted by the people in his photographs. He wanted to know what had become of those who had survived; he wanted to connect with the families of those who hadn't. In 1998, he wrote an article for his former newspaper under the headline, "Where Are You, Little Girl Who Performed the Loyalty Dance?" A week later, he heard from Kang Wenjie. Kang still lived in Wudalianchi City, not far from the Russian border. She made a living selling wholesale clothes to Russian traders. She was married and had a 12-year-old son. Kang told Li she had been picked to represent her city those many years ago because she could sing and dance, but she hadn't even known that the dance she performed that day had a name. After Li told her about it, she used the very word in her reaction that he had thought in 1968: ke xiao—absurd. "I was merely a naive child who knew nothing," Kang, now 46, says today. "How could I become that well known after a dance?" Li says the story reminds him of the fable of the naked emperor's new clothes—here was a child who couldn't even read Mao's writings being held up as a model of Maoist thought. "During the Cultural Revolution," Li says, "no one dared to tell the truth." Even today, the truth about those dark days remains a delicate subject. Li's book has been published in six languages, but it is not available in China. Jennifer Lin covered China from 1996 to 1999 for the Philadelphia Inquirer, where she remains a reporter.
3bc699900eb899e8438a4080ade1cd3d
https://www.smithsonianmag.com/history/darwin-on-lincoln-and-vice-versa-48151291/
Darwin on Lincoln and Vice Versa
Darwin on Lincoln and Vice Versa Because Darwin and Lincoln are forever paired, thanks to their shared birthdate 200 years ago and the profound and lasting (but separate) influence of their ideas and actions, as Adam Gopnik explains, a question arises: What did they think of each other? In today's hyper-mediated, celebrity-saturated global village the world's leading biologist and the leader of the free world might be expected to meet at, say, the World Economic Forum in Davos, Switzerland (though we're not aware that Lincoln or Darwin skied), at a climate-policy summit or over pints at Bono's. But Darwin and Lincoln did not cross paths. And though a perusal of reliable sources suggests that the two did not mention each other by name in writing, there's evidence they were at least aware of one another's efforts. Darwin, a staunch abolitionist, followed the war in the Times of London, whose correspondent in the States was not sufficiently against slavery, Darwin wrote, and covered the war "detestably." He wrote to the Harvard botanist Asa Gray between 1862 and 1865 referring to the Civil War, slavery or the "president." Darwin was not forthcoming about Lincoln and appeared to grow more pessimistic about the war as the years went on. On June 5, 1861, Darwin wrote to Gray: I never knew the newspapers so profoundly interesting. N. America does not do England justice: I have not seen or heard of a soul who is not with the North. Some few, & I am one, even wish to God, though at the loss of millions of lives, that the North would proclaim a crusade against Slavery. In the long run, a million horrid deaths would be amply repaid in the cause of humanity. What wonderful times we live in.... Great God how I shd like to see that greatest curse on Earth Slavery abolished. Lincoln issued the final Emancipation Proclamation on January 1, 1863. Eighteen days later, Darwin wrote to Gray, an abolitionist evidently more optimistic about the course of the war than Darwin: Well, your President has issued his fiat against Slavery— God grant it may have some effect.— ... I sometimes cannot help taking most gloomy view about your future. I look to your money depreciating so much that there will be mutiny with your soldiers & quarrels between the different states which are to pay. In short anarchy & then the South & Slavery will be triumphant. But I hope my dismal prophecies will be as utterly wrong as most of my other prophecies have been. But everyone's prophecies have been wrong; those of your Government as wrong as any.— It is a cruel evil to the whole world; I hope that you may prove right & good come out of it. It cannot be said that Lincoln, for his part, gave Darwin that much thought. The one passage we turned up about Lincoln and evolution focuses on his interest in a book that preceded Darwin's On the Origin of Species by some 15 years. That was Vestiges of the Natural History of Creation, first published anonymously in 1844 by the Scottish journalist Robert Chambers. It presented a cosmic theory of evolution that lacked Darwin's key insight (the mechanism of natural selection), posited a biased view of human progress, was roundly criticized by scientists as mistaken about geology and other subjects, and in subsequent editions took pains to say it was perfectly compatible with Christian theology. Still, it did advance the idea to a wide audience that species we see today were not fixed but had descended from other forms, and the controversy it stirred gave Darwin pause.
The following passage comes from the groundbreaking 1889 biography Herndon's Lincoln, by William Herndon, Lincoln's longtime law partner in Springfield, Illinois. Herndon writes: For many years I subscribed for and kept on our office table the Westminster and Edinburgh Review and a number of other English periodicals. Besides them I purchased the works of Spencer, Darwin, and the utterances of other English scientists, all of which I devoured with great relish. I endeavored, but had little success in inducing Lincoln to read them. Occasionally he would snatch one up and peruse it for a little while, but he soon threw it down with the suggestion that it was entirely too heavy for an ordinary mind to digest. A gentleman in Springfield gave him a book called, I believe, "Vestiges of Creation," which interested him so much that he read it through. The volume was published in Edinburgh, and undertook to demonstrate the doctrine of development or evolution. The treatise interested him greatly, and he was deeply impressed with the notion of the so-called "universal law," evolution; he did not extend greatly his researches, but by continued thinking in a single channel seemed to grow into a warm advocate of the new doctrine. Beyond what I have stated he made no further investigation into the realm of philosophy. "There are no accidents," he said one day, "in my philosophy. Every effect must have its cause. The past is the cause of the present, and the present will be the cause of the future. All these are links in the endless chain stretching from the finite to the infinite." That's the extent of what's known about Lincoln's thoughts on evolution, says Michael Lind, a senior fellow at the New America Foundation and author of the 2004 book What Lincoln Believed. "Herndon's testimony suggests that Lincoln was not only familiar with the idea of evolution," Lind says in an email, "but convinced by it."
d8beed33c552fd78c203d457d419a040
https://www.smithsonianmag.com/history/dead-beneath-londons-streets-180970385/
The Dead Beneath London’s Streets
The Dead Beneath London’s Streets Grave robbers had gotten there first. Sometime in the 16th century, they ransacked the tomb for its gold and grave goods, leaving the bones behind and the lid cracked. But five centuries later, on the southern banks of the Thames, in London’s Southwark neighborhood, the Roman sarcophagus was unearthed again, this time by construction workers building a new residential development. Weighing nearly three tons and buried sometime between 86 and 328 A.D., the stone sarcophagus contained the body of a woman believed to have been about 30 years old at the time of her death. The bones of an infant were found with her, but it’s unclear whether the woman and child were buried together. The sarcophagus dates to London’s earliest years, not long after the Romans planted the walled settlement of Londinium on the marshy north bank of the Thames in 43 A.D. The sarcophagus, to the south of the settlement and across the river, was found just to the west of a Roman road, covered by centuries of human construction and detritus. It was the find of a lifetime for the archaeologists who worked on it. But in the course of London’s nearly 2,000-year history, perhaps it’s not so surprising at all. The sarcophagus, its occupants, and 40 years’ worth of London’s Roman burial finds are part of an exhibition at the Museum of London Docklands running until the end of October. “Roman Dead,” inspired by the sarcophagus’s discovery, explores how Roman Londoners treated death; many of the objects have never before been displayed. Some of the finds are grim, even for skeletons: four of the skulls on display came from a pit found near the London Wall (the Roman-built wall that once encircled the city) filled with more than 40 skulls of men between the ages of 18 and 35, all killed by blunt force trauma to the head. Others are mysterious: the skeleton of a dog, buried in her own grave with her collar but without her head; an iron ring welded in place around an arm, though it is unclear whether this was done before or after death, or why. The exhibition also seeks to show that London has been, from its founding, a center of trade, peopled by immigrants from across the known world. One of the skeletons, for example, belonged to a blue-eyed woman of black African ancestry who travelled to London via southern Mediterranean trade routes. She was just one of the nearly 60,000 residents the settlement boasted at the height of Rome’s power in Britannia. The exhibition underscores one of the most important and consistent sources of archaeological information under London’s streets: the bones. And there are a lot of bones. Though the population of Londinium declined after the Romans left in the fifth century, the city trundled on for two centuries more. Its fortunes changed with the renewed attention of the Saxons, who called it Lundenwic, and over the next millennium, it continued to attract people, power and trade. During the medieval period, people were buried in churchyards, of which there were more than 100 in the City of London. When the population was only around 15,000, as it was in 1100, burying people in the churchyard was sustainable. When it rose to 80,000 by the end of the 13th century, it became less so. And when people died in unimaginable numbers, as they did during the plague years – in 1348, the Black Death killed around 40,000 people within months – parish cemeteries became dangerously crowded.
The response was mass burial grounds in fields outside the city walls, but the city soon swallowed these, too. This history of London, punctuated by the ebb and flow of populations, means that the physical remains of countless Londoners sit just there, under the pavements. Glittering Terminal Five at Heathrow Airport? Construction uncovered fragments of a Neolithic monument, bronze spearheads, a Roman lead font, an early Saxon settlement, and medieval coins, evidence of 9,000 years of near-continuous human habitation. Just feet from the MI6 building – the one blown up in Skyfall – archaeologists discovered the oldest structure in London: 6,000-year-old Mesolithic timber piles stuck deep in the Thames foreshore, the remains of a structure that once sat at the mouth of the River Effra, where it meets the Thames. In the basement of Bloomberg’s new European headquarters in the heart of the City, there’s a modern shrine honoring an ancient temple, the Roman Mithraeum, built in 240 A.D. next to the river Walbrook to honor the Roman god Mithras. In the basement of a high-end hair salon in Leadenhall, just past the rows of chairs and mirrors, are the remnants of a Roman wall. London is a city built on bones, both figuratively and very literally. Luckily for archaeologists, the United Kingdom is one of few European countries that actively asks developers to balance the needs of the present against the preservation of the past. *** In the 1570s, the City of London was one square mile of squalor and wonder. Behind walls plotted by the Romans and defended by the Saxons, London’s 180,000 inhabitants breathed, ate, slept, defecated and died in a space denser than the most crowded cities of today. This was a London that needed somewhere to put all of these people. New buildings were going up where they could, made from timber, brick and stone “recycled” from existing structures (including any remaining Roman walls or ruins that hadn’t been picked over before). Clay for bricks could be dug from pits outside the walls and in 1576, a group of workmen were doing just that in an area of fields and orchards just beyond Bishopsgate, called Spitalfields. As they trawled through the deep earth with shovels and picks, separating the rocks from the clay, they made a discovery. “Many earthen pots, called Vrnae, were found full of Ashes, and burnt bones of men, to wit, of the Romanes that inhabited here,” writes John Stow in his 1598 Survay of London: Contayning the Originall, Antiquity, Increase, Modern estate and description of that Citie. Stow, a Bishopsgate tailor-turned-documentarian of the life of the city, was there in the clay pit that day. He saw the urns, each containing the burnt bones and ashes of dead Romans and “one peece of Copper mony, with the inscription of the Emperour then raigning”. He saw “vials and other fashioned Glasses, some most cunningly wrought, such as I have not seene the like, and some of Christall, all which had water in them, nothing differing in clearnes, taste, or savour from common spring water, what so ever it was at the first: some of these Glasses had Oyle in them verie thicke, and earthie in savour, some were supposed to have balme in them, but had lost the virtue.” He saw smooth red earthenware dishes, with Roman letters stamped on the bottoms, and lamps decorated with Roman figures. And, of course, he saw bones.
He’d heard reports of stone sarcophagi – just like the one found in Southwark – being dug up in the same field, and saw for himself the bones of people who’d been buried in timber coffins, the wood long since disintegrated, leaving only the long iron nails behind. The other men on the site, he said, declared that “the men there buried were murdered by drilling those nayles into their heads,” but he judged that explanation “unlikely” – the nails, he said, still had fragments of wood under their heads. He took home one of the nails, as well as a man’s lower jaw, “the teeth being great, sound, and fixed.” He also held onto an urn, with its bones and ashes, and a small pot in the shape of a hare squatting on her hind legs. Stow’s account demonstrates what makes London London: The past can’t stay buried in a city that’s always digging it up. It’s only been in the last century, though, that real effort has gone into preserving that past. Stow’s Spitalfields Roman burial site was uncovered at a time when, while there might have been a reverence for ancient remains and the stories they told, there was no mechanism for removing and investigating them. What was removed – human and material remains – ended up in private collections or, quite possibly, the rubbish. “There wasn’t such a feeling of, ‘Ooh, we must preserve this,’” says Meriel Jeater, curator of the Museum of London’s archaeology collection. “Later on, in the 17th century, during the rebuilding of London after the Great Fire, other Roman remains were found and they were recorded by antiquaries and kept in people’s collections… Christopher Wren [St. Paul’s architect] found Roman remains during the reconstruction of St. Paul’s Cathedral, and a Roman tombstone was found near Ludgate; people were very excited at the time.” But they didn’t really know what to do with what they found. In the 18th and 19th centuries, as cabinets of curiosities gave way to museums and interest in classical antiquity reached a peak inspired by the Romantic movement, academics turned their attention to these finds. But even through the Victorian era and into the 20th century, popular interest in antiquities was not enough to motivate some property developers to preserve what they might have found in the course of building. Moreover, explains Jeater, the Victorians preserved only what they valued: coffins, urns, and sarcophagi, yes; the bones within them, no. Despite the modern instinct to preserve sites untouched, many artifacts wouldn’t have been found at all if it hadn’t been for the perpetual need to redevelop and to build in a city that can’t stop growing. During Stow’s lifetime, the population of London quadrupled, reaching 250,000 in 1600, one-eighth of the entire population of England. By the time of the Georgians, areas that had once been suburbs of the City were now more or less central and increasingly crowded. With the Industrial Revolution in full swing, the population of the city exploded from 630,000 in 1700 to 2.32 million people in 1850, making London the largest city in the world. By then, it was nearly 17 miles from end to end, straddling the great river and swallowing up whole villages, but in just the last 100 years, London has continued to grow, increasing its population by more than 60 percent.
This churn of development makes the job of an archaeologist in the city even trickier: “You might have a Roman layer, and bits of medieval dug down into that, then there’s post-medieval and modern things going in, too,” says Jeater. In the middle of the 20th century, the city’s building boom wasn’t only the result of growth—one in six London buildings was destroyed during the Blitz in World War II. In the years after the bombings, archaeologists – now more alive than ever to the need to preserve history – scrambled to excavate sites before developers built over them. “It was a really challenging environment,” says Jeater, who in 2017 curated an exhibition of photographs of this period of London archaeology for the Museum. One early archaeologist, Ivor Noel Hume, who later went on to manage the excavation of Colonial Williamsburg, “nearly got wiped out by a crane once.” But those excavations were done on an ad hoc basis. “They were only there due to the goodwill of people doing the construction,” says Jeater. That generosity only stretched as far as was useful for developers: the foundations of Bloomberg’s Mithraeum were actually found in 1954, during the post-war rebuilding of an office block, but developers simply noted the find and then dismantled it, removing it from its original site and, for a short time, displaying it on the roof of a parking garage. By 1989, historians and archaeologists had had enough. The discovery of the foundations of Shakespeare’s Rose Theatre on the southern bank of the Thames prompted nationwide protest when it appeared that developers would be razing the grounds. Ultimately, the building was redesigned to accommodate the foundations, but in response to the outcry, Parliament passed legislation the following year requiring developers to plan for the management of a site’s history before obtaining planning permission; if a developer is unable to preserve finds in situ, which is preferred, there must be a plan to preserve them by record or offsite. But, crucially, developers are required to pay for everything, from the site assessments to the excavation itself; most estimates put planning for archaeology at 1 to 3 percent of the development’s total budget. By 2007, 93 percent of all archaeology in the United Kingdom was being paid for by developers. “Archaeology is completely intertwined in the planning process. From a very early point in the project, time has been already allocated for it,” says Marit Leenstra, an archaeologist with the Museum of London Archaeology (MOLA), a charitable organization that conducts archaeological excavations on behalf of developers in and around London (it is no longer affiliated with the Museum of London). In some cases, developers will decide to make their development’s unique archaeological history part of the building. When Bloomberg bought the site of the Mithraeum in 2010, the company decided to reunite the temple with its original location and turn it into a museum space, employing MOLA to excavate further. This built-in expectation was part of the reason that excavation of the Southwark site, where the Roman sarcophagus was discovered, went so smoothly. It’s also why further excavation of Spitalfields, where John Stow made off with a human jawbone, was able to recover another Roman sarcophagus, as well as the remains of 126 people, dozens of homes from Stow’s own time, and an 18th-century umbrella factory.
It’s a process that has worked for more than 25 years and, says Leenstra, has been an inspiration for other European countries, including France, which passed similar “preventative archaeology” legislation in 2001. “I think the rest of Europe is catching up,” she says. Meanwhile, this formal acknowledgement of the importance of preserving the country’s deep history has opened new realms of possibility for research in the city. “It’s about recording as much as you can in that area before it changes, and it’s about opportunity – we wouldn’t be able to dig in the center of London unless a new office building was being built,” explains Jeater. *** Now, all bones, no matter how small and fragmented, are logged into a database maintained by the Centre for Human Bioarchaeology, part of the Museum of London. By 2014, the database had recorded bones from 16,207 individual skeletons, spanning nearly the entirety of documented human habitation in the London area. And those are just the ones they’ve found. So even now, when you’re walking the streets of London or wandering through one of its parks, are the chances good that you’re walking over someone’s grave? “Oh, yes,” says Jeater. Modern archaeology in London demonstrates that the past and the present are never far from each other. And that they need each other – without the need to constantly reinvent this ancient city, archaeologists would never get the chance to see what (or who) is under those office blocks and terraced houses. This has always been the case for a city like London; it’s only now, however, that the need to build is tempered by the inclination to preserve. Linda Rodriguez McRobbie is an American freelance writer living in London, England. She covers the weird stuff for Smithsonian.com, Boing Boing, Slate, mental_floss, and others, and she's the author of Princesses Behaving Badly.
20467897161d2de2a2c629b53771b86b
https://www.smithsonianmag.com/history/deadly-grizzly-bear-attacks-changed-national-park-service-forever-180964462/
The Deadly Grizzly Bear Attacks That Changed the National Park Service Forever
The Deadly Grizzly Bear Attacks That Changed the National Park Service Forever Glacier National Park’s busiest season came to an abrupt halt in the summer of 1967. In a matter of hours, two grizzly bears had acted as they never had before in the park’s 57-year history. Several miles apart, each bear had mauled a young woman on the same day, in the dark, early hours of August 13. Two 19-year-olds, Julie Helgeson, from Minnesota, and Michele Koons, from California, were both asleep under the big sky of northwest Montana, when grizzly bears found them and carried them off. Detailed in National Park Service reports and Jack Olsen’s 1969 book Night of the Grizzlies, these incidents marked Glacier’s first fatal bear maulings. The shocking attacks ushered in a new era for the National Park Service’s management of bears. In Glacier Park and in other parks nationwide, the lessons of that summer live on in warning signs, rules and policies created to avoid repeating the mistakes that led to tragedy 50 years ago. Before then, the park service neglected to close trails where bear sightings were frequent. Littering was common and campsites overflowed with garbage that attracted animals. And in the summer of 1967, as forest fires drove bears further into populated areas, it was clear to some rangers that bears were living dangerously close to people. John Waller, a current supervisory wildlife biologist for the park, says the park service had known for a long time that feeding bears was unsafe. But it wasn’t until after the summer of 1967 that the agency recognized a need for dramatic changes in official park policy. The park quickly overhauled its practices and implemented precautions that are still in use today. “Night of the Grizzlies,” as the events came to be known, “was really the wake-up call,” he says. On August 12, 1967, Helgeson and Koons embarked on separate overnight backpacking trips. Both were spending a summer working in one of the park’s lodges, Helgeson in East Glacier Lodge, Koons in West Glacier’s Lake McDonald Lodge. Helgeson’s path was surrounded by vistas of glacial valleys and mountain peaks. Her excursion took her from Logan Pass roughly eight miles up the popular Highline Trail to the Granite Park Chalet. She and a friend, Roy Ducat, arrived at about 7 p.m., ate their sack dinners and watched the sunset before retiring for the night. Helgeson and Ducat tucked into their sleeping bags outside, near the chalet, packed with guests during the busy summer season. Shortly after midnight, a grizzly bear meandered toward the campers. Ducat would later tell investigators that Helgeson had seen the bear and woken him, telling him to play dead. The grizzly knocked the pair out of their sleeping bags and, within minutes, the bear had sunk its teeth into each of them. It focused on Helgeson, dragging her about 100 yards away. “Someone help us!” she screamed as the bear dragged her off. Ducat, his arm badly mangled, ran to wake other campers nearby. Help arrived for Ducat in the form of a helicopter with medical supplies, but an overly cautious ranger held up the search party, fearful of putting more visitors at risk. Nearly two hours passed before the group departed on its mission to rescue Helgeson. After Ducat was taken to a hospital, and a ranger armed with a rifle had arrived, the group followed a blood trail downhill from the campsite. Soon, they heard a noise and spotted Helgeson facedown, not far below. A doctor staying at the chalet attended to her.
“It hurts,” she said several times. The group carried her back to the chalet, where a helicopter would arrive to take her to a hospital. She reached the chalet by 3:45 a.m. but died soon thereafter, minutes before the aircraft landed. As Helgeson departed on her fateful hike, Koons was joining four fellow park employees on a steep eight-mile journey to Trout Lake. A grizzly crashed their campsite at about 8 p.m. as they cooked hotdogs and fresh fish. The campers ran and waited as the bear gobbled up their dinner and scrambled away with one of their backpacks. The party moved their gear, along with some cookies and Cheez-Its, to the beach. In a ring around a campfire, they settled into their sleeping bags. At about 4:30 a.m., the grizzly reappeared at Koons’ camp. It sniffed around, biting into one of the young men’s sleeping bags and clawing his sweatshirt. One by one, the campers jumped up and climbed trees. From their perches, they yelled at Koons to join them. But before she could, the bear tore into her sleeping bag and began dragging her away. “He’s got my arm off,” the others heard her say. “Oh God, I’m dead,” she said. The party stayed in the trees for about an hour and a half before running down the trail to the nearest ranger station. Seasonal rangers Leonard Landa and Bert Gildart had gone to sleep with knowledge of the Granite Park Chalet mauling. Gildart had heard the calls for help over the radio and helped dispatch emergency responders. Landa stayed awake listening to the radio traffic. When both men learned of the Trout Lake mauling later that morning, they were confused and in disbelief. The rangers were sent to search for Koons. Landa left first with some of Koons’ fellow hikers. Gildart, filled with adrenaline, hurried up the trail to join them. “We were all a little spooked by this time,” Gildart says, reflecting on the events of 50 years ago. “Here’s a bear that’s pulled a girl out of a sleeping bag. What kind of a creature is this?” Minutes after they reached the campsite and fanned out, Gildart recalls Landa whispering, “Bert, here she is.” The young woman’s mutilated body was lifted out of the backcountry by helicopter. The rangers were stunned by the night’s parallel events but not by the problem bears. Landa knew that a bear had been harassing campers at Trout Lake and another nearby camp. And four days earlier, Gildart and seasonal wildlife biologist David Shea had hiked to Granite Park Chalet to confirm another rumor they’d heard: Bears were feeding nightly on table scraps from boarders at the chalet. “We got up there and we were absolutely astounded that people were standing around throwing food out to the bears,” Gildart recalls. The routine had become a spectacle for visitors. “It was basically an incident waiting to happen,” says Shea, who spent 36 seasons working in the park. The garbage problem wasn’t isolated to Granite Park. Campsites all around Glacier were not well maintained. Visitors, sloppy with their trash, frequently abandoned it. Gildart later collected 17 bags of garbage from the Trout Lake site. The day after the deadly attacks, Gildart and Landa headed out to look for the suspect bear at Trout Lake. Gildart spotted it at 4 a.m. when he stepped outside of the patrol cabin where the men were spending the night. He called for Landa to bring a gun. Within minutes, the bear charged at them and both men fired, killing it. A forensic investigator came to collect the bear. “They had a big knife,” Gildart recalls.
“They slit the stomach of this bear, and a big ball of blonde hair came out.” Shea also was on the hunt for the suspect bear at Granite Park Chalet. In total, the park’s staff shot three bears, including the one believed to have killed Helgeson. In his book, Jack Olsen indicted the park service for its irresponsible treatment of bears. Olsen, a journalist and prolific author of true crime books, investigated the killings for a three-part series published in Sports Illustrated. His reporting was republished as Night of the Grizzlies. The best-seller was reprinted in 1996, and visitors will still spot people reading the book in the lobbies of Glacier Park's lodges. “It is pure coincidence indeed that two grizzlies chose a few hours of a single night to take two victims who had much in common,” he wrote, “but it is no coincidence at all that the year in which this happened was 1967, and the place Glacier Park.” “It was a lightning bolt right to the core of the whole National Park Service nationwide,” says Waller, the current staffer at Glacier. In the aftermath of the attacks, the park initiated a strict “pack in, pack out” policy. Dumps were eliminated. Rangers ticketed visitors who fed bears and kicked out campers with messy campsites. When grizzlies frequented trails, the areas were closed until the bears moved on. Warnings and tips on bear safety were posted throughout the park. The park set rules for food storage, installed bear-proof trash cans and devised off-the-ground storage for backcountry campers. A new permit system limited the number of campers in the backcountry and required them to sleep in designated campsites, a distance away from cooking areas. The events of August 13 were a pivotal moment, Waller says, giving rise to a “leave no trace” ethic in the outdoors. The result has been increased safety for people and bears, he says. The new practices soon spread to other national parks in which bears lived. By 1970, Yellowstone, the other park in the lower 48 where people were most likely to encounter a grizzly, had enacted many of the same policies. “The tragedy of [that night],” Landa says, “is that two lives were lost.” But Shea adds that the “common sense” precautions that hikers follow today are the good that came of the horror.
a8c552ac0f7ac1e55a467cc35340ae77
https://www.smithsonianmag.com/history/debate-over-rebuilding-ensued-when-beloved-french-cathedral-was-shelled-during-wwi-180971999/
The Debate Over Rebuilding That Ensued When a Beloved French Cathedral Was Shelled During WWI
The Debate Over Rebuilding That Ensued When a Beloved French Cathedral Was Shelled During WWI For nearly a millennium, the French city of Reims was synonymous with its towering Gothic cathedral known as Notre-Dame. Not to be confused with the cathedral sharing the same name in Paris, the Reims church was the heart and soul of the region, its tallest towers rising 265 feet above the city’s 50,000 residents, its resplendent halls used for the coronation of nearly every French monarch since the 13th century. But on the eve of the First World War in 1914, the cathedral’s magnificence brought it a different kind of attention: that of an easy target. When the fighting began in August of that year, the invading German army quickly overwhelmed the northeast part of France, including Reims, and transformed the cathedral into an infirmary. They filled the church with 3,000 cots and 15,000 bales of dried grass to use as pallets—all of which remained inside the building after September 4, when the Allied forces of France and the United Kingdom sent the Germans on a rapid retreat following the First Battle of the Marne. With Reims now only a handful of miles from the front, the real destruction began. Five German artillery shells hit the cathedral on September 18, crashing into the medieval structure, but the more devastating attack came a day later. “The projectiles, perhaps incendiary, set afire first the scaffold [around the towers] and then the hay. No more inflammable tinder could have been devised, and no accelerant was required,” writes historian Jan Ziolkowski. Lead from the burning roof poured through the mouths of the church’s stone gargoyles; windows exploded; the Smiling Angel statue that had stood near the front door for centuries lost its head. Unlike the recent fire at Notre Dame de Paris, the assault on Reims Cathedral continued for four years. Around 300 German shells smashed into Notre Dame de Reims after its initial fire; around 85 percent of buildings in the city were destroyed as well. By the end of the war, the famous cathedral was a skeleton of its former self, and a symbol of the incomprehensible brutality of the conflict. * * * From its earliest days, the city of Reims (pronounced rahnce) was a cultural crossroads. As one of the Roman Empire’s largest cities, it hosted merchants from across the continent, and in 496 it also became the center of French Christendom. According to an account written long after the fact, that year marked the baptism of King Clovis. The Frankish leader had already united the surrounding territories into what would become France; now he was transforming the region’s religious landscape. It seemed only fitting that some 700 years later, a massive cathedral would be built on the same spot. The question of when construction began on Notre Dame de Reims has been debated for decades. “There’s this document that talks about a fire and gives a date of 1210,” says Rebecca Smith, an art historian at Wake Tech Community College who has written extensively about the cathedral’s origins. “They don’t mention what burns or how much damage there is, but everybody assumed the cathedral must’ve started construction around 1211 right after the fire.” But recent archaeological analysis by researchers Willy Tegel and Olivier Brun has shown otherwise. They used recovered wood fragments dating all the way back to around 1207 to prove the cathedral was under construction earlier than believed. What no one doubts is the importance of the cathedral from its start.
The beginning of the 13th century marked a dramatic increase in the number of Gothic cathedrals being erected. The architectural style was a flamboyant one, with religious buildings adorned by flying buttresses and elaborate decorations. The goal for these churches, Smith says, was “to show off the stained glass, to be taller and thinner and push toward the heavens, toward God.” And since the cathedral at Reims was being erected around the same time as Notre Dame de Paris, an element of competition arose between the cities. But Reims Cathedral secured its place in the religious hierarchy early in its 75-year construction. When the 12-year-old Louis IX was crowned in 1226, he declared that all future monarchs would be crowned at Notre Dame de Reims, harkening back to the history of Clovis as France’s first Christian king. This decree was largely followed for the next 500 years, including a famous episode in 1429 when Joan of Arc fought past opposing forces to bring the French prince to Reims, where he could be legitimately crowned Charles VII. The cathedral also survived multiple calamities. In 1481, a fire burned through the roof, and a storm on Easter Sunday in 1580 destroyed one of the great windows. The church even survived the French Revolution of 1789, when the monarchy was temporarily overthrown. The coronation cathedral remained intact despite fighting across the country; citizens recognized its historical importance and couldn’t bear to see it ravaged. These centuries of attachment to the cathedral made its destruction in World War I that much more devastating. Upon returning to Reims after the fighting, French author Georges Bataille wrote, “I had hoped, despite her wounds, to see in the cathedral once again a reflection of past glories and rejoicing. Now the cathedral was as majestic in her chipped and scorched lace of stone, but with closed doors and shattered bells she had ceased to give life… And I thought that corpses themselves did not mirror death more than did a shattered church as vastly empty in its magnificence as Notre-Dame de Reims.” When France passed a law supporting the reconstruction of damaged monuments at the end of the war in 1919, fierce debates erupted over what work should be done on Reims Cathedral. Many argued in favor of leaving it as a ruin. “The mutilated cathedral should be left in the condition in which we have found it at the end of the war,” argued architect Auguste Perret. “One must not erase the traces of the war, or its memory will be extinguished too soon.” According to historian Thomas Gaehtgens, Perret even argued for building a concrete roof over the crumbling cathedral so that all could see the destruction the German army had wrought. But Paul Léon, director of historic preservation at the Ministry of Culture, thought differently. “Does anyone really believe that the inhabitants of Reims could bear the sight of the mutilated cathedral in the heart of their city?” Besides that, the cold and wet climate of Reims would make it exceedingly hard to preserve the ruins. After months of debate and assessments of the damage, reconstruction finally began in late 1919. The Reims Cathedral became a global cause célèbre, and donations poured in from countries around the world. Among the most sizable donations were several from oil baron John D. Rockefeller, who gave more than $2.5 million (almost $36 million in today’s dollars) to be put toward the reconstruction of several French monuments.
By 1927 a large portion of the work was complete, though restoration of the facades, buttresses and windows continued until July 10, 1938, when the cathedral reopened to the public. Much of the cathedral was restored as it had been before the war, though the chief architect overseeing reconstruction, Henri Deneux, was initially criticized for using reinforced concrete rather than wood for the roof. As for the damaged sculptures, some were left as they were, with chips still knocked out. This included gargoyles with solidified lead still dripping from their mouths. As for the famous stained-glass windows, some had been rescued over the course of the war, while many others were remade by artists who referenced other artworks from the Middle Ages, rather than trying to create a pastiche. Of course, the architects and artists working on reconstruction couldn’t have predicted that yet another war would soon engulf the continent. Although the cathedral again suffered some damage during World War II, it received far fewer attacks and remained largely intact. “Cathedrals are living buildings,” says Smith, the art historian. “They’re constantly undergoing cleanings, they’re constantly undergoing restorations and renovations. They’ve always been understood as needing to flex.” For Smith, deciding how to rebuild or restore medieval architecture requires a delicate balance between preserving the past and erasing it to make way for the future. But that is something architects who worked on Notre Dame de Reims have always taken into consideration. As for Notre-Dame de Paris, investigations are ongoing to understand what caused the devastating fire that consumed much of the cathedral’s roof. Construction workers have hurried to prevent any further collapses of the crumbling structure, but more than $1 billion has already been raised to rebuild the Parisian monument. But it’s worth reflecting on the example of the Reims Cathedral, and the knowledge that these medieval marvels were built with an eye toward longevity. They were physical representations of humankind’s attempt to reach the divine from our lowly place on Earth. It’s a sentiment that has survived countless catastrophes—and will likely survive many more. Editor's note, April 19, 2019: This piece has been corrected to note that Rebecca Smith did not contribute to the analysis of the early wooden fragments from the church. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
22030a42c01e8aabc4bcfd11291e15a8
https://www.smithsonianmag.com/history/decoding-lost-diary-david-livingstone-180953385/
Decoding the Lost Diary of David Livingstone
Decoding the Lost Diary of David Livingstone The last decade of David Livingstone’s life did not go well for the famed Scottish missionary and explorer. In 1862, his long-neglected wife, Mary, came to join him in Mozambique, but she quickly contracted malaria and died. Nevertheless, he continued on his mission to find a navigable route along the Zambezi River. But in 1864, seven years before his famous encounter with Henry Morton Stanley, Livingstone was forced to give up and return to Britain after most of his men abandoned him or succumbed to disease. He quickly fell from public grace as word got out about his failure to navigate the river. Eager to redeem his reputation, he returned to Africa two years later, this time in search of the source of the Nile River. But yet again, his assistants soon began deserting him, and added insult to injury by taking all of his food and medicine with them. Starving and crippled by pneumonia, cholera and cutaneous leishmaniasis, Livingstone had no other choice but to turn to Arab traders for help. But this posed a moral dilemma for the staunch abolitionist: his saviors were the types of men he had been criticizing throughout his professional career for their involvement in the lucrative slave trade to India and the Arabian Peninsula. From here, the account of what happens next differs depending on whether you read the official version issued by Livingstone’s publisher in 1874, or whether you consult Livingstone’s diary, whose brief entries, detailing the period from 1871 to 1873, are, scholars think, a much more honest representation of Livingstone’s true thoughts and experiences. But until very recently, the diary was completely illegible. Having run out of paper and ink, Livingstone used the juice from a local berry to write on an 1869 edition of The Standard newspaper that a friend had sent him (he didn’t receive it until 1871). In 1873, Livingstone died in a small village in Zambia, having succumbed to malaria and dysentery. His diary was shipped back to England along with Livingstone’s body, but as early as 1874, the juice had faded to the point of near-invisibility, and the newspaper’s dark type further obscured efforts to decipher it. So for nearly 150 years, Livingstone’s secrets remained firmly locked away on those faded sheets. Adrian Wisnicki, an English professor at the University of Nebraska-Lincoln and a faculty fellow in the Center for Digital Research in the Humanities, first heard about the diary in 2000. Wisnicki trained in the humanities, but his quest to find and decode the diary eventually led him to his true calling, a relatively new field called digital humanities. Digital humanities scholars use computers, technology and social media to address questions in disciplines ranging from literature to history to art. One of the earliest projects to demonstrate the usefulness of this approach was the attempt to decipher the Archimedes Palimpsest, a 10th-century parchment that contained an unknown work by Archimedes. In the 13th century, however, a Christian monk had erased the original Archimedes text and reused the parchment for transcribing religious texts.
A team of imaging scientists, information technology consultants and library managers began working on separating the two layers of writing using advanced spectral imaging, a technique that uses separate wavelengths of light to enhance or tone down different chemical signatures—in this case, the ink the original Byzantine scribe used versus that of the monk. This teases those tangled words apart, allowing scholars to read or see what is otherwise invisible to the human eye. As the project progressed, Archimedes’ lost words were slowly revealed. It was a success, recovering not only “The Method of Mechanical Theorems”—a work originally thought to be lost—but also a formerly lost commentary on Aristotle’s Categories by Alexander of Aphrodisias, and the only known existing manuscript by Hyperides, a 4th-century B.C. Athenian politician. “Spectral imaging technology is a real game-changer,” says Mike Toth, president of R.B. Toth Associates, the technology company that decoded the Archimedes Palimpsest, along with many other historic documents. “Without it, it’s like trying to read what’s been erased on a white board and then written over. All of that heritage would be lost.” In the years following the Archimedes Palimpsest, other methodologies joined the digital humanities’ tool kit, and projects ranged from investigating Thomas Jefferson’s edits on the rough draft of the Declaration of Independence to creating multi-spectral images of the papyrus-based Gospel of Jesus’ Wife. Wisnicki, however, had not yet caught the digital humanities bug. When he went in search of the diary, he was a traditional scholar, trained in the art of research and critical thought, not spectral imaging and metadata collection. In the early 2000s, he was pursuing an interest in 19th-century British incursions into Africa, especially the way that British explorers’ rough, honest field diaries were later converted into polished tales of adventure, heroism, danger and discovery that became best-selling books. “The books that came to represent 19th century Africa were often very detached from the actual experiences of individuals in the field,” Wisnicki says. “To some degree, they were as much fiction as they were nonfiction.” For this reason, Wisnicki explains, scholars consider the “original, unbridled, uncensored, written-in-the-heat-of-the-moment notes” to be much more trustworthy documentation of what actually took place. The hunt for 19th-century British explorers led him to Livingstone, one of the most famous of that cohort of men—and to rumors about Livingstone’s lost diary. But when Wisnicki finally managed to track down its scattered pages, which were tucked away in several forgotten boxes in the David Livingstone Centre just outside of Glasgow, he found that they were completely unreadable. On a whim, several years after beginning his search, he contacted a friend involved in digital humanities, who directed him to a listserv. Within a day, he had received 30 responses, half of which advised him to reach out to the team behind the Archimedes Palimpsest. On the second day, however, Roger Easton, an imaging scientist from the Rochester Institute of Technology who worked on that famous project, contacted Wisnicki himself. “He said, ‘You have a manuscript that might interest us,’” Wisnicki recalls. As it turned out, digital humanities was indeed the solution for transcribing the diary. And more importantly for Wisnicki, his own scholarship would never be the same. Once he embarked down that technologically enriched path, he was hooked.
“I started out as a very traditional humanities scholar, looking at archives and books and forming arguments and writing, mostly on my own,” he says. Toth soon got involved, too, and began scanning the pages of the diary, looking for the precise wavelengths that would reveal the writing underneath, and several other experts based in locations ranging from Baltimore to Scotland helped with the post-imaging processing and metadata cataloguing. The project, Toth says, was unique. “We always think in terms of undertext, or that which has been erased or scraped off, but this was a case of overtext,” he explains. “Plus, there was this unknown berry ink that posed an interesting challenge.” After subjecting the diary to spectral imaging, the team was left with more than 3,000 raw images, totaling 750 gigabytes of data. All of this needed to be processed by imaging scientists so that the text could actually be read. Easton handled the first phase of processing, which involved a technique called principal component analysis. PCA uses statistics to find the greatest variances among the spectral images of a text. When those images are combined—from most to least variance—they can reveal details lost to the human eye. Easton then handed off nine different PCA images to Keith Knox, an imaging consultant in Hawaii. With those images in hand, Knox was able to crack the legibility puzzle by adding a false color to the pages—light blue, the color that turned out to best mute the printed newspaper text—so that the darker written text stood out. Wisnicki opened up his email one morning to find those pages, an experience that he describes as extraordinary. “It was like history was being made on the screen while I’m sitting there in my pajamas,” he says. In the end, Wisnicki and his colleagues were able to transcribe about 99 percent of Livingstone’s diary. Those words reveal a much more nuanced story than Livingstone’s publisher ever put forth. “The nice thing about Livingstone is that, compared to some other 19th century writers, his writing is fairly easy to read,” Wisnicki says. The diary begins on March 23, 1871. Forced to team up with the Arab slave traders due to his deteriorating health, Livingstone found—to his dismay—that he was actually beginning to like these men. “The Arabs are very kind to me, sending cooked food every day,” he wrote in April. He told them about the Bible, taught them how to make mosquito nets and drank fermented banana juice with them, which he swore off in the next day’s entry. “They nurse him into health, they become friends,” Wisnicki says. “It’s a very complex relationship.” On the other hand, he soon began to look down on and resent the local people he encountered. Whereas Livingstone had generally had good experiences interacting with locals in the past, this time, he was lumped in with the traders and treated with distrust. He found it impossible to get the help and cooperation he needed to set out on a separate expedition to find the source of the Nile. “The Manyema are not trustworthy and they bring evil on themselves often,” he complained of the local Bantu tribe. Days turned into weeks. By June—still lacking a canoe and having declared himself a “victim of falsehood”—Livingstone went so far as to follow the Arabs’ advice and use force to either get his money back from a local chief or to finally get the canoe he was promised.
“He’s been out in the field for a long time, and he’s losing contact with reality and becoming more and more desperate to travel,” Wisnicki says. “He starts to take on some of the methods the slave traders use to control the local population.” So Livingstone sent some men to the nearby village with the instructions to “bind and give him a flogging” if the chief still did not cooperate. “On the scale of existing violence in that region at that time, it’s not that significant,” Wisnicki says. “But the fact that Livingstone has taken a step down that path is a big deal.” On July 15, however, Livingstone was abruptly woken from his stupor. The traders—his friends—went into a busy nearby market and began randomly firing guns into the crowd and burning down surrounding villages, killing at least 300 people, many of them women and children. Livingstone had never witnessed such an atrocity before, and he was “crushed, devastated and spiritually broken,” Wisnicki says. In Livingstone’s own words: “I was so ashamed of the bloody Moslem company in which I found myself that I was unable to look at the Manyema. . . This massacre was the most terrible scene I ever saw.” “It’s a wakeup call,” Wisnicki says. “He realizes that he’s started to go the wrong way himself.” Livingstone immediately left the traders and decided to retrace his steps east, a journey that brought him to a village called Ujiji. “He might have been flawed and human, but he was guided by a big ideal,” Wisnicki says. “He had a vision.” There, he heard rumors of an Englishman spotted nearby. The diary ends at that point. Since 1869, no one had received any sort of communication from Livingstone. So James Gordon Bennett, Jr., who published the New York Herald, decided his paper would “find” Livingstone. The story, he knew, would be a hit among readers. So he hired Stanley, a Welsh journalist and explorer, to track down Livingstone. The mission wound up taking two years, but it was a success. A week or two after Livingstone’s diary ends, history tells us that Stanley famously greeted the elusive doctor with the line “Dr. Livingstone, I presume?” “From there, everything changes,” Wisnicki says. Livingstone again becomes the steadfast abolitionist and hero, his flirtation with moral corruption recorded only in the fading pages of his patchwork diary. Additionally, Stanley supplied Livingstone with new notebooks, so he gave up the newspaper and wrote several more diaries before he died two years later. Though none of those diaries pose the same legibility challenges as the newspaper one, Wisnicki is currently transcribing them so that those interested can have a complete picture of Livingstone’s last journey to Africa. As for Livingstone, some critics wonder what he would have thought about having his deepest secrets and feelings exposed for all to read, years after his death. “Part of his vision was informing the world about what was happening in Africa with the slave trade,” Wisnicki says. “So I think he would have approved.” Rachel Nuwer is a freelance science writer based in Brooklyn.
8a1eff327241b571713584173190edda
https://www.smithsonianmag.com/history/deep-look-politicians-passed-civil-rights-act-1964-180951799/
A Deeper Look at the Politicians Who Passed the Civil Rights Act of 1964
A Deeper Look at the Politicians Who Passed the Civil Rights Act of 1964 The Civil Rights Act of 1964, a landmark piece of legislation, was a long time in the making, and the passage of the bill required the political machinations of an assortment of Republicans, Democrats, Northerners and Southerners, congressmen, senators, presidents and activists. The photo above, taken by White House press office photographer Cecil Stoughton, shows the wide range of politicians and private citizens it took to guide the Civil Rights Act from a presidential promise to a national law. Congress had considered, and failed to pass, a civil rights bill every year from 1945 to 1957. In 1957, Congress finally managed to pass a limited Civil Rights Act, which it added to in 1960, but these bills offered black Americans only modest gains. It wasn't until 1963, in a televised speech, that President Kennedy called for a robust Civil Rights Act. Kennedy began his address by talking about the two black students who had recently enrolled in the University of Alabama, but needed the presence of Alabama National Guardsmen in order to safely attend classes. "It ought to be possible…for every American to enjoy the privileges of being American without regard to his race or his color. In short, every American ought to have the right to be treated as he would wish to be treated, as one would wish his children to be treated," the president said, noting that while he had recently met with dozens of business leaders in an effort to persuade them to voluntarily adopt measures to end discrimination, he would also bring the matter before Congress. “Next week I shall ask the Congress of the United States to act,” President Kennedy said, “to make a commitment it has not fully made in this century to the proposition that race has no place in American life or law." Eight days later, on June 19, 1963, Emanuel Celler, a New York Democrat, introduced H.R. 7152—what would become the Civil Rights Act of 1964—to the House of Representatives. But the political fight over the bill's passage was just beginning. Kennedy knew that he would need support from both sides of the aisle to ensure the bill's passage, and wasted no time recruiting allies to his purpose. One such ally was William McCulloch, a Republican congressman from a conservative district in rural Ohio who would become one of the civil rights movement’s most ardent supporters. During President Kennedy’s administration, McCulloch worked with the Democrat-led White House to ensure Republican support of the Civil Rights Act in Congress. Held in August of 1963, the March on Washington was a historic moment for the civil rights movement, and Martin Luther King, Jr., riding the momentum of the occasion, wasted no time turning an eye toward the passage of the comprehensive civil rights bill sitting before Congress. In a piece titled "In a Word—Now," King wrote of the Civil Rights Act as being an integral part of the movement's present fight: "What next? The hundreds of thousands who marched in Washington marched to level barriers. They summed up everything in a word—NOW. What is the content of NOW? Everything, not some things, in the President’s civil rights bill is part of NOW." Celler, who was chairman of the House Judiciary Committee, helped ensure that the bill had favorable hearings at the committee level in the House—perhaps too favorable.
Liberal Democrats and Republicans on the committee combined to push the bill in a more liberal direction, calling for a fair employment section that would ban discrimination by private employers, as well as a section that expanded the power of the Attorney General to intervene in Southern civil rights cases. Fearing that the bill would become impossible to pass, Kennedy himself had to intervene, creating a compromise that kept the fair employment section but limited the power of the Justice Department. The bill passed from the House Judiciary Committee to the House Rules Committee on November 20, 1963. But some—both in Congress and the White House—worried that a strong, liberal bill would stand no chance of making it through the legislative process. Others, like Congressman Arch Moore, a Republican from West Virginia, didn't agree, as Moore told the press that if the House sent the Senate "a water bill," the Senate would send back "a water-water bill." On November 22, 1963, President Kennedy was assassinated in Texas, and as the nation mourned the loss of their president, the future of the Civil Rights Act seemed less certain than ever before. The bill was saved when President Lyndon Johnson decided to throw his full political weight behind its passage. In his address to a joint session of Congress on November 27, 1963 (five days after Kennedy's death), Johnson was resolute, declaring, "We have talked long enough in this country about equal rights. We have talked for 100 years or more. It is time now to write the next chapter, and to write it in the books of law." Still, when the House adjourned in December of 1963, no decision had been made. Using his experience as a former Senate majority leader, President Johnson worked to help petition for the bill to be discharged from the House Rules Committee, whose chairman, segregationist Howard Smith from Virginia, had allowed the bill to fester aimlessly. On February 10, the House finally passed the bill. The bill then ran into steely opposition in the Senate, facing a 60-day debate and a 14-hour-long filibuster led by Senator Robert Byrd of West Virginia—a Democrat and former member of the Ku Klux Klan. The debate over the Civil Rights Act is still, to this day, the longest debate in Senate history. President Johnson, for his part, helped break the filibuster that kept the bill locked in the Senate by finding ways to compromise with Southern lawmakers. On June 10, the Senate invoked cloture, breaking the filibuster. Nine days later, the Senate approved the Civil Rights Bill, but the bill, having had some changes made to it, had to be sent back to the House for another vote. In a phone conversation two days after the bill made it through the Senate, President Johnson called Rep. Charles Halleck (R-IN), urging the Republican—who was also the House minority leader—to push the bill through. Johnson wanted the bill to be signed into law by July 4—leaving enough time for it to be enacted before the Republican National Convention, which was to begin July 13. On July 2, 1964, the House adopted the Senate’s version of the bill by a vote of 289-126. Natasha Geiling is an online reporter for Smithsonian magazine.
05cbcbe779a6da76cc4a3f8643f0a505
https://www.smithsonianmag.com/history/determining-americas-national-myth-will-determine-countrys-fate-180977067/
The Pitfalls and Promise of America’s Founding Myths
The Pitfalls and Promise of America’s Founding Myths Alexander Hamilton had no illusions about what would happen to Americans if the United States collapsed. If the newly drafted Constitution wasn’t ratified, he warned in Federalist No. 8, a “War between the States,” fought by irregular armies across unfortified borders, was imminent. Large states would overrun small ones. “Plunder and devastation” would march across the landscape, reducing the citizenry to “a state of continual danger” that would nourish authoritarian, militarized institutions. “If we should be disunited, and the integral parts should either remain separated, or … thrown together into two or three confederacies, we should be, in a short course of time, in the predicament of the continental powers of Europe,” he continued. “Our liberties would be a prey to the means of defending ourselves against the ambition and jealousy of each other.” Hamilton’s 1787 plea was successful, of course, in that Americans adopted a new, stronger Constitution two years later. But they still didn't agree on why it was they had come together and what defined them as a people. Maintaining a shared sense of nationhood has always been a special challenge for the United States, arguably the world’s first civic nation, defined not by organic ties, but by a shared commitment to a set of ideals. The U.S. came into being not as a nation, but as a contractual agreement, a means to an end for 13 disparate rebel colonies facing a common enemy. Its people lacked a shared history, religion, or ethnicity. They didn’t speak a language uniquely their own. Most hadn’t occupied the continent long enough to imagine it as their mythic homeland. They had no shared story of who they were and what their purpose was. In short, they had none of the foundations of a nation-state. The one unifying story Americans had told themselves—that they had all participated in the shared struggle of the American Revolution—lost its strength as the Founders’ generation passed from the scene, and had been shaken by secession movements in the Appalachian backcountry of Pennsylvania and Virginia in the 1790s and in New England during the War of 1812. By the 1830s, it had become increasingly clear that this identity crisis could no longer be papered over: Americans knew they needed a story of United States nationhood, if their experiment were to survive. The first person to package and present such a national story for the United States was the historian-statesman George Bancroft. Bancroft, the son of a famous Unitarian preacher in Massachusetts, graduated from Harvard in 1817 and was promptly sent by that college’s president on an epic study-abroad trip to the German Confederation, another federation of states contemplating its identity. In Europe, Bancroft studied under Arnold Heeren, Georg Hegel, and other intellectuals who were developing ideas of Germanic nationhood; chummed around with Lafayette, Washington Irving, Lord Byron, and Goethe; backpacked on foot from Paris to Rome; and returned home, doctorate in hand, with his head churning with ideas about his country’s place in the world. After failing in bids to be a poet, professor, prep school master, and preacher (who memorably evoked the image of “our pelican Jesus” in a sermon), Bancroft embarked upon what would prove to be his life’s work: giving his young nation a history that would answer those great questions: Who are we? Where did we come from? Where are we going?
Bancroft’s vision—laid out over four decades in his massive, 10-volume History of the United States—combined his Puritan intellectual birthright with his German mentors’ notion that nations developed like organisms, following a plan that history had laid out for them. Americans, Bancroft argued, would implement the next stage of the progressive development of human liberty, equality, and freedom. This promise was open to people everywhere: “The origin of the language we speak carries us to India; our religion is from Palestine,” Bancroft told the New York Historical Society in 1854. “Of the hymns sung in our churches, some were first heard in Italy, some in the deserts of Arabia, some on the banks of the Euphrates; our arts come from Greece; our jurisprudence from Rome.” Bancroft’s expansive notion of American identity had questionable aspects, too. He claimed that the Founders were guided by God, that Americans were a chosen people destined to spread across the continent, that success was all but preordained—notions whose hubris and imperialist implications would become clear during his lifetime. But the core of it has remained with us to this day: a civic national vision that defined an American as one devoted to the ideals set down in the Preamble to the Declaration of Independence: equality, liberty, self-government, and the natural rights of all people to these things. Bancroft’s draft of our national myth was taken up and refined by Abraham Lincoln. In the Gettysburg Address, the president presented the myth—“a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal”—not as our destiny, but as an ideal that had not yet been achieved and, if not fought for, could perish from the Earth. It’s no accident that the definitive copy of the Address is one Lincoln handwrote and sent to Bancroft, who months later was chosen by Congress to deliver the official eulogy for the assassinated president. One had influenced the other. The abolitionist Frederick Douglass—who like Bancroft had traveled to the White House during the war to lobby Lincoln to take a stand for the Declaration’s ideals—carried this civic nationalist torch through the dark days of the 1870s and 1880s. It was a time when Northern and Southern whites agreed to put aside America’s commitments to human equality in favor of sectional unity, even when it meant tolerating death squads in the South and the effective nullification of the 14th and 15th Amendments. “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours,” Douglass said in an 1869 speech that summarized U.S. civic nationalism as well as anyone ever has. “We shall spread the network of our science and civilization over all who seek their shelter… [and] all shall here bow to the same law, speak the same language, support the same Government, enjoy the same liberty, vibrate with the same national enthusiasm, and seek the same national ends.” Douglass, who had escaped from slavery, was, unlike Bancroft, well aware that America had not implemented its ideals and that it was not at all inevitable that it ever would. That made his framing of the task and its stakes far more compelling, accurate, and ultimately inspirational than the bookish and often oblivious historian’s. 
But Bancroft’s vision of American civic cohesion was not the only national narrative on offer from the 1830s onward, or even the strongest one. From the moment Bancroft articulated his ideas, they met a vigorous challenge from the political and intellectual leaders of the Deep South and Chesapeake Country, who had a narrower vision of who could be an American and what the federation’s purpose was to be. People weren’t created equal, insisted William Gilmore Simms, the Antebellum South’s leading man of letters; the continent belonged to the superior Anglo-Saxon race. “The superior people, which conquers, also educates the inferior,” Simms proclaimed in 1837, “and their reward, for this good service, is derived from the labor of the latter.” Slavery was endorsed by God, declared the leading light of the Presbyterian Church of the Confederacy, Joseph Ruggles Wilson, in 1861. It was one of many Anglo-Saxon supremacist ideas he instilled in his loyal son, Woodrow. The younger Wilson spent the 1880s and 1890s writing histories disparaging the racial fitness of Black people and Catholic immigrants. On becoming president in 1913, Wilson segregated the federal government. He screened The Birth of a Nation at the White House—a film that quoted his own history writings to celebrate the Ku Klux Klan’s reign of terror during Reconstruction. Simms, the Wilsons, and Birth of a Nation producer D.W. Griffith offered a vision of a Herrenvolk democracy, a homeland by and for the dominant ethnic group, and in the 1910s and 1920s, this model reigned across the United States. Confederate monuments popped up across former Confederate and Union territory alike; Jim Crow laws cemented an apartheid system in Southern and border states. Directly inspired by the 1915 debut of The Birth of a Nation, a second Klan was established to restore “true Americanism” by intimidating, assaulting, or killing a wide range of non-Anglo Saxons; it grew to a million members by 1921 and possibly as many as 5 million by 1925, among them future leaders from governors to senators to big-city mayors, in addition to at least one future Supreme Court justice, Hugo Black. The Immigration Act of 1924 established racial and ethnic quotas devised to maintain Anglo-Saxon numerical and cultural supremacy. This ethno-nationalist vision of our country was dethroned in the 1960s, but it remains with us, resurgent, today. Its strength shouldn’t be underestimated: Simms’s vision is as old and as “American” as Bancroft’s, and it was the dominant paradigm in this country for nearly as many decades. It will not just slink off into the night. It must be smothered by a more compelling alternative. The civic nationalist story of America that Bancroft envisioned still has the potential to unify the country. Its essential covenant is to ensure freedom and equality of opportunity for everyone: for African Americans and Native Americans—inheritors of the legacies of slavery and genocide—to be sure, but also for Americans with ancestors from Asia and Latin America, India and China, Poland, France, or Ireland. For rural and urban people; evangelicals, Jews, Muslims, and atheists; men, women, nonbinary people, and, most certainly, children. It’s a coalition for Americans, a people defined by this quest, tasked by the preamble of the Constitution to promote the common good and individual liberty across generations.
Over the past century, cultural, judicial and demographic changes have strengthened its hand, ending white Christian control over the electorate in all the large states, not a few of the small ones, and in the federation as a whole. It’s not an off-the-shelf product, however. Its biggest failings—arrogance, messianic hubris, a self-regard so bright as to blind one to shortcomings—stem from the Puritan legacy Bancroft was so steeped in. The Puritans thought they had been chosen by God to build a New Zion. Bancroft believed the product of their mission was the United States, and that it was destined to spread its ideals across a continent and the world. This notion of American Exceptionalism—that the U.S. can walk on water when other nations cannot—needs to be jettisoned and replaced by the humility that comes with being mere mortals, able to recognize the failures of our past and the fragility of our present and future. It’s a task that will take a generation, but could bring Americans together again, from one shining sea to the other. Colin Woodard is a journalist and historian, and the author of six books including Union: The Struggle to Forge the Story of United States Nationhood. He lives in Maine.
4b2823b5d50d2fa4350b2b72b29918bf
https://www.smithsonianmag.com/history/divided-loyalties-107489501/
Divided Loyalties
Divided Loyalties The invitation arrived with a question: “Since we’ll be dining in the 18th century,” it read, “would you mind wearing a British Redcoat? Also, you’ll be expected to swear loyalty to King George. I hope this won’t be a problem.” A week later, I found myself inside a drafty Gothic church in the center of Saint John, New Brunswick, surrounded by dozens of costumed historical reenactors, each channeling the personality of a long-dead Tory or Hessian. They had come from all over Maritime Canada—the Atlantic Seaboard provinces of New Brunswick, Nova Scotia and Prince Edward Island—to celebrate the 225th anniversary of DeLancey’s Brigade, one of 53 Loyalist regiments that fought alongside the British during America’s Revolutionary War. Up from Shelburne, Nova Scotia, came the Prince of Wales American Regiment. The Royal American Fencibles crossed the Bay of Fundy from Yarmouth. So did officers from the Kings Orange Rangers in Liverpool. Amid the rustle of women’s petticoats and the flash of regimental swords, they greeted a cast of characters straight out of Colonial America: a quietly earnest parson garbed in black, wearing the swallow-tailed collar of an Anglican cleric, and a buckskinned spy with the British Indian Department, who confided he was busy organizing Iroquois raids on the Continental Army. Seated at a table groaning under the weight of 18th-century-style comestibles—a tureen of turnip soup made from a 1740 recipe; a bowl of heirloom apples not sold commercially in more than a century; and a marzipan dessert shaped to resemble a hedgehog—it was easy to slip into a parallel universe. At this regimental gathering, there was no discussion of the war on terrorism. Instead, we lamented General Burgoyne’s blunder at the Battle of Saratoga in 1777 and congratulated ourselves on how well Loyalists were fighting in the Carolinas. “These clothes just feel right,” whispered military historian Terry Hawkins, a red-coated lieutenant colonel, amid a chorus of huzzahs offered to George III. “I belong in this scene.” Unlike many Civil War aficionados, who even today bear the burden of the Confederacy’s lost cause, Canadian Tories are sanguine about the outcome of their war: the British defeat, to their way of thinking, ensured that they escaped the chaos of American democracy. “After Harold and I participated in a reenactment of the Battle of Bunker Hill, we took the kids out to Cape Cod for a swim,” remembers a smiling Wendy Steele, who wore a voluminous, hoop-skirt gown of the kind popular in the 1780s. “They paraded along the beach shouting, ‘George Washington is rebel scum.’ What a marvelous vacation it was!” When the minstrels had finished singing “Old Soldiers of the King” and launched into “Roast Beef of Old England,” I returned the borrowed trappings of empire and strolled down Charlotte Street through the late summer twilight. Ahead lay the old Loyalist burial ground; the corner where Benedict Arnold once lived; and King’s Square, whose diagonal crosswalks are arrayed to resemble a Union Jack. To the right loomed Trinity Church, spiritual successor of the Lower Manhattan structure abandoned by its Anglican congregation following Britain’s defeat in 1781. Inside the silent church, gray stone walls covered with chiseled plaques commemorate those “who sacrificed at the call of duty their homes in the old colonies.” The plaques told a story of loss and removal. Somewhere inside the sacristy lay a silver communion chalice bestowed upon Saint John’s founders by George III. 
But high above the nave hung what is surely the church’s most highly valued treasure: a gilded coat of arms—the escutcheon of Britain’s Hanoverian dynasty—that once adorned the Council Chamber of the Old State House in Boston. “We grew up with the knowledge that our ancestors were refugees who had been robbed and tortured because of their loyalty,” says Elizabeth Lowe, a fifth-generation descendant of Benedict Arnold’s cousin Oliver. “We may have learned to accept the Americans, but we will never forget our history.” Schools teach American children that our revolutionary struggle was a popular uprising against heavy-handed taxes and self-serving imperialism. But the fight for independence was also a bloody civil war in which perhaps one out of five Americans preferred to remain a British subject. Massachusetts and Virginia undoubtedly were hotbeds of revolt, but New York, Georgia and the Carolinas contained sizable populations loyal to the Crown. “Rebels gained control of New England early in the war,” says historian John Shy, professor emeritus at the University of Michigan. “Americans who mistrusted New England never embraced the Revolution, and neither did Indians on the frontier who thought independence would lead to further encroachment on their land. The bloodiest fighting occurred in the Carolinas where the populations were equally divided.” Divisions within Colonial society extended even into the founding fathers’ families. Benjamin Franklin’s son William defied his father and remained Royal Governor of New Jersey until his arrest in 1776. (After his release in 1778, William eventually fled to England; he and his father were forever estranged.) George Washington’s mother and several of his cousins, not to mention Virginia’s influential Fairfax family, were Tory. John Adams and John Hancock both had in-laws outspokenly loyal to King George. Several delegates to the Continental Congress were related by marriage to active Tories. “All families are liable to have degenerate members,” declared New Jersey delegate William Livingston upon the arrest of his nephew. “Among the twelve apostles, there was at least one traitor.” To keep Tories (a derisive 17th-century term first applied by English Puritans to supporters of Charles II that came to define people who disagreed with the Revolution) in line once the Declaration of Independence was signed, most states enacted restrictive “Test Acts” that required their citizens to formally denounce the British Crown and swear allegiance to their resident state. Those who failed to take the oath were subject to imprisonment, double and triple taxation, confiscation of property and banishment. Nor could they collect debts, buy land or defend themselves in court. Connecticut made it illegal for these Loyalists to criticize Congress or the Connecticut General Assembly. South Carolina required supporters of the Crown to make reparations to victims of all robberies committed in their counties. Congress quarantined the entire population of Queens County, New York, for its reluctance to join patriot militias. Many in the Continental Congress defended the Test Acts, arguing that money from the sale of confiscated property could be used to buy Continental loan certificates—war bonds of the day. George Washington described fleeing Tories as “unhappy wretches” who “ought to have . . . 
long ago committed suicide.” When one of his generals tried to put a stop to physical violence directed against Loyalists, Washington wrote that “to discourage such proceedings was to injure the cause of Liberty in which they were engaged, and that nobody would attempt it but an enemy to his country.”  Anti-Tory sentiment was especially intense in Massachusetts. When 1,000 Loyalists fled Boston along with British general William Howe in March 1776, Colonists sang: The Tories with their brats and wives Should fly to save their wretched lives. Though neither side was blameless when it came to gratuitous cruelty, probably no combatants suffered more than those in Loyalist regiments. British, Hessian and American officers all loosely adhered to an accepted code of conduct that held that soldiers were prisoners of war who could be exchanged or released on parole if they promised to refrain from further fighting. But Tories were viewed as traitors who, if caught, could be banished to the frontier, imprisoned indefinitely or executed. “In this war,” one Tory sympathizer would write, “only those who are loyal are treated as rebels.” After the October 1780 battle at Kings Mountain, South Carolina, in which nearly 200 Tory militiamen died, victorious patriots lynched 18 Loyalists on the battlefield, then marched the remaining prisoners north. After a week on the road, the starving, ragtag procession had traveled only 40 miles. To speed up the pace, patriot officers summarily convicted 36 Tories of general mayhem and began stringing them up three at a time. After nine Tories were hanged from the limb of an oak tree, the killing was halted, to the distress of one colonial who remarked, “Would to God every tree in the wilderness bore such fruit as that.” Curiously, Tories suffered even at the hands of British officers who, for the most part, dismissed them as ignorant provincials. The British especially distrusted Loyalist militia regiments, claiming that they were slow to follow orders and often went off on their own to seek revenge against those who had destroyed their property. This contemptuous attitude may explain why Lord Cornwallis, when he surrendered at Yorktown in 1781, yielded to Washington’s demand that Tories be turned over to victorious Continental soldiers as prisoners of state, not war, thus allowing them to be executed as traitors. As the British sloop Bonetta set sail from Yorktown, hundreds of Tories frantically rowed after the departing ship. All but 14 were overtaken and brought back to shore. Nearly two more years would pass before the Treaty of Paris was signed and the British departed from the United States. Much of the delay resulted from disagreements about what to do with the Tories. During treaty negotiations in France, British officials wanted all property and full legal rights returned to those who had been dispossessed. American negotiators adamantly refused. In the end, the treaty stipulated that Congress would “earnestly recommend” that “the legislatures of the respective states” curtail persecution and that Loyalists be given 12 months to reclaim their property. But Congress had no power to enforce the provisions, and Britain lacked the will to ensure compliance. As one cynical Loyalist wrote: Tis an honor to serve the bravest of nations And be left to be hanged in their capitulations. By the spring of 1783, a massive refugee exodus was under way. 
At a time when the total population of America was about 2.5 million, an estimated 100,000 Tories, up to 2,000 Indians, most of them Iroquois, and perhaps 6,000 former slaves were forced to leave the country. The Iroquois crossed into Canada. Many slaves who had agreed to fight for Britain, in return for a promise of freedom, went to Nova Scotia; many of them later immigrated to Sierra Leone. Several thousand Tories moved to the Bahamas. Another 10,000 settled in Jamaica and the rest of the British West Indies. Florida, then a British possession, was swamped with new arrivals, as was Ontario, then known as Upper Canada. But the largest number, perhaps as many as 40,000 in all, headed for the British colony of Nova Scotia. Newly independent Americans scoffed at the notion that anyone would willingly live in “Nova Scarcity.” One Tory refugee described the colony as a land “covered with a cold, spongy moss, instead of grass,” adding that “the entire country is wrapt in the gloom of perpetual fog.” But Nova Scotia was not without its virtues. Largely uninhabited, the colony, roughly comprising present-day New Brunswick and Nova Scotia, plus part of what is now Maine, was covered by virgin forest, a considerable resource given that all ships were constructed of timber. Just off the coast, the Grand Banks was the most fertile fishing ground in the world. But the most important advantage accrued from Britain’s Navigation Act, which required trade between its Atlantic dominions to be carried in British or colonial vessels. Let America look west to its new Mississippi frontier. Nova Scotia’s displaced merchants would soon monopolize commerce with the West Indies. “It is, I think, the roughest land I ever saw,” wrote Stamford, Connecticut’s Sarah Frost upon arriving at the mouth of the St. John River early in the summer of 1783. “We are all ordered to land tomorrow, and not a shelter to go under.” Others viewed their exile in even bleaker terms. Noted one Loyalist: “I watched the sails disappearing in the distance, and such a feeling of loneliness came over me that although I had not shed a tear through all the war, I sat down on the damp moss with my baby on my lap, and cried bitterly.” Despite the anguish of dislocation, Nova Scotia grew rapidly over the next 12 months. Within a few months, the port of Shelburne on Nova Scotia’s south coast had 8,000 residents and three newspapers and was well on its way to becoming the fourth-largest city in North America. After observing the diversity of talent in the region’s growing population, Edward Winslow, a Tory colonel from Massachusetts who later became a judge in New Brunswick, predicted, “By Heaven, we will be the envy of the American states.” Some Loyalist leaders wanted to replicate 18th-century England, in which the rich lived off large estates with tenant farmers. “But most of the new arrivals were infected with America’s democratic ideals,” says Ronald Rees, author of Land of the Loyalists. “Nobody wanted to be a tenant farmer anymore. More than a few Tories condemned ‘this cursed republican town meeting spirit.’ ” By the mid-19th century, Britain had begun eliminating trade protections for Maritime Canada, thereby putting these colonies at a disadvantage relative to the much more developed American states. “Britain’s embrace of free trade was the killer blow,” says Rees. “By 1870, steam had replaced sail, and all the best lumber had been cut. 
Once all the timber was gone, the Loyalists had nothing the British wanted.” Inside New Brunswick’s provincial legislature, enormous portraits of George III, whose erratic behavior eventually gave way to insanity, and his wife, the self-effacing Queen Charlotte, dominate a chamber that replicates Britain’s House of Commons. And the image of a British galleon, similar to those that carried Loyalists from America, adorns the provincial flag. Beneath the ship floats New Brunswick’s resolute motto: Spem Reduxit (Hope Restored). “There is no place on earth more loyal than here,” says historian Robert Dallison, as he ambles through Fredericton’s Old Public Burial Ground, past tombs whose weathered epitaphs relate a story of unvarying defiance and privation. Leaving the cemetery, Dallison drives down to the St. John River and turns onto Waterloo Row. On the left, a number of stately properties stand on land first developed by Benedict Arnold. On the right, down a gravel road past an overgrown softball field, several stones in a pool of mud mark the anonymous graves of starved Loyalists hastily buried during the harsh winter of 1783-84, a period Maritime history books call “the hungry year.” Maritime Canada’s living monument to its Loyalist past lies just north of Fredericton at Kings Landing, a 300-acre historical settlement that comes alive each summer when 175 costumed employees work in and about 100 relocated homes, barns, shops and mills that once belonged to Loyalists and their descendants. At Kings Landing, it’s possible to sample a hearth-baked rhubarb tart, observe the making of lye soap and learn how to cure a variety of maladies from Valerie Marr, who, in her role as a colonial healer, tends what appears to be a sprawling patch of weeds. “A Loyalist woman needed all these plants if she expected her family to survive,” Marr says. “Butterfly weed cures pleurisy. Tansy reduces arthritic pain if it’s mixed with a bit of vinegar.” Marr, who is 47, has worked at Kings Landing for 26 years. “I tell my friends that I’ve spent half my life in the 19th century,” she says with a laugh. Kings Landing gardeners grow heirloom fruits, flowers and vegetables in demonstration plots and work with Cornell University to preserve a variety of apples no longer sold commercially. Various traditional species of livestock, including Cotswold sheep, are bred here as well. “Kings Landing is a living portrait of a society striving to regain what it lost in the American Revolution,” says chief curator Darrell Butler. “We’re re-creating history.” No less a luminary than England’s Prince Charles attended the 1983 bicentennial celebration of the Penobscot Loyalists’ mass migration to Canada. “I was wearing my United Empire Loyalist pin when I met Charles,” sighs retired teacher Jeannie Stinson. “I told him that everybody in my family is a Loyalist. He smiled and told me that I didn’t look 200 years old.” America’s Tories were among the British subjects who transformed Canada, which was largely French territory until 1763, into an English-speaking country. Today some 3.5 million Canadians—more than 10 percent of the country’s population—are direct descendants of Americans on the losing side of the Revolutionary War. But the world moves on. Memories fade, values morph, new people arrive. 
For more than two centuries, Saint John, New Brunswick, proclaimed itself the Loyalist City, and schools were dismissed and merchants donned colonial garb when Saint John annually memorialized the arrival of Sarah Frost and her fellow Tories. Today, however, Saint John styles itself as “The Fundy City” and celebrates the ebb and flow of the Bay of Fundy’s tides, to the dismay of some. “What exactly is a ‘Fundy City?’ ” grumps Eric Teed, an Anglophile barrister who is the former president of the New Brunswick chapter of United Empire Loyalists (UEL). “Saint John is the Loyalist City, but now there’s all this cultural competition for heritage marketing.” To keep their ancestors’ accomplishments from being forgotten, in 2001 the UEL published a curriculum aid for history teachers entitled The Loyalists: Pioneers and Settlers of the Maritimes. “We distributed it free of charge to all of the schools, but I don’t think it is being used,” says Frances Morrisey, a UEL descendant of one of New Brunswick’s founding fathers. “Loyalists gave Canada peace, order and good government, but now they’re being forgotten.” Saint John’s mayor, Shirley McAlary, sees no cause for concern. “There are a lot of new people living here who have no connection to the UEL,” she says. “The Loyalist people are growing older and their children are leaving. Now it’s the Irish who are stronger and more united. It’s hard to keep history alive if it doesn’t change.” In the nearby town of Liverpool, on Nova Scotia’s rocky Atlantic shore, history needs no re-creation. On the anniversary of George III’s birthday, John Leefe, whose Huguenot ancestors were forced to flee Mount Bethel, Pennsylvania, 220 years ago, bivouacs with the Kings Orange Rangers, a re-created regiment of 50 historical reenactors formally recognized by the British government. And each summer Leefe, who is mayor of the surrounding municipal region, presides over Privateer Days, a community gala celebrating Loyalist pirates who raided U.S. shipping following the Revolutionary War. “My own family was living in America 100 years before the Revolution even began. Perhaps that is why I use every occasion to toast King George,” Leefe says with a smile. “Canada is a mosaic, not a melting pot, and that allows people to remember their family history,” he adds. “Loyalists still view the United States as a dysfunctional family we just had to leave.”
40afb8101cc5138d669a5536b3335e34
https://www.smithsonianmag.com/history/doctor-feelgood-143267828/
Doctor Feelgood
Doctor Feelgood Mel Gibson did it. Brooke Shields too. So did Uma Thurman, Ben Stiller and Carrie Fisher. They and dozens of other celebrities have all come forward, in books or on TV, to discuss their struggles with alcoholism, or drug addiction, or postpartum depression, or other long dark nights of the soul. Quite possibly, misery has never loved company more than in American pop culture right now. So strong is our preference for redemptive narratives of adversity overcome that after James Frey's purported memoir A Million Little Pieces was revealed to contain a pack of fabrications, it returned to the New York Times nonfiction bestseller list for an encore appearance. Samuel Johnson was no Mel Gibson, but his biography includes the makings of a modern celebrity sobfest: birth into poverty; a host of ailments, both physical and psychological; and, of course, the burdens of fame. In his time (1709-84), Dr. Johnson was a renowned critic, biographer, moral philosopher and creator of A Dictionary of the English Language. He was also known to be a bit strange. But in his moments of crisis, he issued no statements through his publicist (or his protégé and future biographer, James Boswell), and he declined to retreat into solitude; instead, he fashioned his own recovery, in ways that anticipate popular currents in contemporary psychology. Johnson went on to write about happiness and melancholy, joining a larger Enlightenment dialogue on those topics among such luminaries as Voltaire, Diderot, Rousseau and Jeremy Bentham. (Like our own time, the 18th century was preoccupied with the idea of happiness.) His writings don't provide the drama of, say, addiction-induced kleptomania, but they do offer a refreshing contrast to the current template for melodramatized suffering and contentment. With diligent effort and keen insight into the workings of the mind, Johnson simply figured out how to work around his afflictions and make himself happy. He started out with the odds against him. "I was born almost dead and could not cry for some time," he recalled late in life. In infancy, scrofulous lymph nodes were found in his neck and attributed to the tuberculosis of his wet nurse. He was transported to Queen Anne's presence in the belief, common at the time, that the royal touch could cure "the King's Evil," as scrofula was called. All his life he had poor vision and hearing. Bizarre tics, odd vocalizations ("too too too," he muttered when excited) and wild gestures rendered his appearance, one observer said, "little better than that of an idiot." But Johnson was a precocious lad. He read prodigiously, mastered Latin ("My master whipt me very well," he told Boswell) and was so helpful to his fellow students that they carried him to school in gratitude. Neurologists now believe that Johnson's convulsions and odd behavior were symptoms of Tourette's syndrome, a disorder first identified in 1885 by Georges Gilles de la Tourette. Johnson's contemporaries left vivid accounts of its effects on him: "His vast body is in constant agitation, see-sawing backwards and forwards, his feet never a moment quiet; and his whole great person looked often as if it were going to roll itself, quite voluntarily, from his chair to the floor," wrote Fanny Burney, the English diarist and novelist. Frances Reynolds, sister of the painter Sir Joshua Reynolds, recorded the curious method by which Johnson led a blind member of his household through a doorway: "On entering Sir Joshua's house with poor Mrs. 
Williams...he would quit her hand, or else whirl her about on the steps as he whirled and twisted about to perform his gesticulations; and as soon as he had finished, he would give a sudden spring, and make such an extensive stride over the threshold, as if he was trying for a wager to see how far he could stride." As if his oddness were not enough, Johnson inherited from his father, Michael Johnson, what he called a "vile melancholy," which, he confided to Boswell, made him "mad all his life." Johnson's first major depressive episode occurred at age 20 while he was on vacation from Oxford, where he was an impoverished but extremely well-read student. Johnson, Boswell wrote, "felt himself overwhelmed with an horrible hypochondria, with perpetual irritation, fretfulness, and impatience; and with a dejection, gloom and despair, which made existence misery." But even in this early period, Johnson exhibited a genius for self-analysis. He wrote up his own case in Latin and gave it to his physician and godfather, Dr. Samuel Swinfen. The doctor was "so much struck with the extraordinary acuteness, research, and eloquence of this paper," writes Boswell, "that in his zeal for his godson he shewed it to several people." Naturally, Johnson was furious. The gloom lifted, and it may be just as well that Johnson didn't seek further medical help after the gross violation of doctor-patient confidentiality. The preferred treatments for melancholy in his time were purges, emetics, bleedings and physical punishment. Johnson prepared to manage his own case, a contemporary noted, by studying medicine "diligently in all its branches," giving "particular attention to the diseases of the imagination." His greatest fear was that he might lose his reason, for it was his powerful intellect that allowed him to keep a grip on sanity. "To have the management of the mind is a great art," he told Boswell, "and it may be attained in a considerable degree by experience and habitual exercise." Johnson would have agreed wholeheartedly with the sentiment of the Greek philosopher Epictetus, who wrote: "People are not disturbed by things, but by the view they take of them." This is the idea at the heart of cognitive-behavioral therapy, a pragmatic, short-term form of psychotherapy now widely used to treat a host of psychological problems. Cognitive-behavior therapists believe that emotional disturbances are caused by "distortions in thinking," erroneous beliefs or interpretations that can trigger anxiety, depression or anger. Take a patient who tells himself: "I got a parking ticket; nothing turns out well for me." Cognitive-behavior therapists refer to this as "catastrophic thinking." It is the therapist's task to help the patient replace such distortions with more realistic interpretations, as in, "It's too bad I got a ticket, but it's a small matter in the scheme of things." Johnson sometimes played cognitive-behavior therapist to the fretful Boswell. On one such occasion, Boswell arrived at Johnson's London home upset and uneasy. He'd had a run-in with his landlord and resolved not to spend another night in his rooms. Johnson laughed. "Consider, Sir, how insignificant this will appear a twelvemonth hence." This insight made a big impression on Boswell. "Were this consideration to be applied to most of the little vexatious incidents of life, by which our quiet is too often disturbed, it would prevent many painful sensations," he wrote. "I have tried it frequently, with good effect." 
Johnson often touched on psychological matters in The Rambler, a twice-weekly pamphlet that he published between 1750 and 1752. Typical is Rambler #29, in which he used cool reasoning and striking imagery to show the folly of catastrophic thinking about future misfortunes. "Whatever is afloat in the stream of time, may, when it is very near us, be driven away by an accidental blast, which shall happen to cross the general course of the current." He believed that idleness provided fertile ground for the melancholy that threatened to consume him. "It is certain that any wild wish or vain imagination never takes such firm possession of the mind, as when it is found empty and unoccupied," he wrote in Rambler #85. He formulated and lived by a simple mantra: "If you are idle, be not solitary; if you are solitary, be not idle." A childless widower in midlife—his wife, Tetty, more than 20 years his senior, died in 1752—Johnson gathered an odd household of characters that became a kind of surrogate family for him. There was his young servant, Frank Barber; the blind Welsh poetess Anna Williams, whose habit of using her finger to judge how much tea to pour in a cup offended Boswell; Robert Levett, a dissolute physician to the poor, and later the penniless widow Elizabeth Desmoulins, the hapless Dr. Swinfen's daughter. They were a motley lot, but he was fond of them. Johnson also gathered a wide support network of friends throughout London society. He filled his evenings with an endless round of dinner parties and was a founding member of the famous Literary Club—Edmund Burke, Joshua Reynolds, Oliver Goldsmith and Boswell were members—in which he found sociability, amusement and a forum for displaying his rhetorical skills. "There is no arguing with Johnson," Goldsmith observed, "for when his pistol misses fire, he knocks you down with the butt end of it." He loved to talk and to eat, but "most important of all," wrote biographer Joseph Wood Krutch, Johnson "won hours of freedom from his own sick mind." But he could not escape solitude entirely. When alone he sought, as Boswell put it, "constant occupation of mind." Naturally, he was a voracious reader. He was also an enthusiastic amateur chemist, often befouling his rooms with noxious fumes. He engaged in a variety of nonchemical experiments, too, once shaving the hair around his right nipple in order to observe how long it took to grow back. A diary entry for July 26, 1768, reads: "I shaved my nail by accident in whetting the knife, about an eighth of an inch from the bottom, and about a fourth from the top. This I measure that I may know the growth of nails." Johnson's various investigations provided occasions for what psychologist Mihaly Csikszentmihalyi calls the "autotelic experience," or "flow," a state in which the individual has "intense emotional involvement" in a rewarding, goal-directed activity. Flow "lifts the course of life to a different level," Csikszentmihalyi writes. "Alienation gives way to involvement, enjoyment replaces boredom, helplessness turns into a feeling of control, and psychic energy works to reinforce the sense of self, instead of being lost in the service of external goals....Concentration is so intense that there is no attention left over to think about anything irrelevant, or to worry about problems." What saved Johnson, time and again, was his ability to step back and view his illness objectively, a talent he exhibited notably when he suffered a stroke near the end of his life. 
He described the episode in a letter to a friend: "I went to bed, and in a short time waked and sat up, as has been long my custom, when I felt a confusion and indistinctness in my head, which lasted, I suppose, about half a minute. I was alarmed, and prayed God, that however he might afflict my body, he would spare my understanding. This prayer, that I might try the integrity of my faculties, I made in Latin verse. The lines were not very good, but I knew them not to be very good: I made them easily, and concluded myself to be unimpaired in my faculties." He never lost his reason or his zest for human connection. And he kept a clear vision of what would keep him happy: "If...I had no duties, and no reference to futurity," he told Boswell, "I would spend my life in driving briskly in a post-chaise with a pretty woman; but she should be one who could understand me, and would add something to the conversation." John Geirland, a writer based in Los Angeles, has a doctorate in social psychology.
85938900d085740e02846d30310a6c31
https://www.smithsonianmag.com/history/document-deep-dive-a-firsthand-account-of-the-hindenburg-disaster-79086828/
Document Deep Dive: A Firsthand Account of the Hindenburg Disaster
Document Deep Dive: A Firsthand Account of the Hindenburg Disaster On May 6, 1937, Frank Ward cut school at noon and hitchhiked to the Naval Air Station in Lakehurst, New Jersey. His father, Peter Ward, was an engineer at the base. So, by extension, 17-year-old Frank, a brawny baseball player and track star at Seton Hall Prep, was regularly recruited to help pull down incoming airships. This particular day, the Hindenburg was scheduled to land. The 804-foot-long dirigible, built by the Zeppelin Company of Friedrichshafen, Germany, had left Frankfurt just three days prior for its first transatlantic flight of the season. In 14 months of operation, the zeppelin had logged 63 trips. Ward served on the ground crew for five of the Hindenburg’s landings at Lakehurst the previous year and expected this one to go as smoothly as the rest. Around 7 p.m., the airship edged over the pine-tree horizon and, after some maneuvering, drifted in for landing. Ward and a cluster of other linemen grabbed one of several landing ropes tossed out of the zeppelin. They ran to tighten the slack in the rope, but before they were able to dock the ship to its mooring mast, disaster struck. The Hindenburg suddenly burst into flames. In just 34 seconds, the vessel crashed to the ground, the future of lighter-than-air commercial travel grinding to a fiery halt with it. While 62 passengers and crew escaped from the smoldering wreckage, 35 perished, as did one ground crew member. Cheryl Ganz, a leading Hindenburg historian, has spent the better part of her career resurrecting stories from salvaged artifacts. Her primary interest is in zeppelin mail; 360 of the 17,609 pieces of mail aboard the Hindenburg were recovered from the crash site. But her research has also turned up never-before-published photographs taken from the Hindenburg, as well as other documents. While preparing “Fire and Ice: Hindenburg and Titanic,” an exhibition at the Smithsonian’s National Postal Museum, Ganz studied 1,278 pages of Hindenburg testimony in 14 file boxes held at the National Archives in College Park, Maryland. Amid the documents, she found a form dated May 17, 1937, filled out by Ward. The Bureau of Air Commerce’s inquiry board, tasked with investigating the cause of the accident, asked the lineman ten questions about what he saw and heard as the tragedy played out. “The board collected these little descriptions and eyewitness accounts before the actual investigation,” says Ganz. “Then, based on what the witnesses saw or what their knowledge level was, they were called for testimony.” Ward was not plucked for oral testimony. But, 75 years later, his story has become ever more precious. At age 92, Ward, who now resides in Charlottesville, Virginia, is one of the only living members of the Hindenburg ground crew. The retired history teacher and veteran of both World War II and the Korean War recently shared his memory of the Hindenburg disaster with me. Based on our conversation, I annotated the document Ganz uncovered with added detail. Megan Gambino is an editor and writer for Smithsonian.com and founded “Document Deep Dive.” Previously, she worked for Outside magazine in New Mexico.
ad9447f8d5fa21e9548e860cff6aac5e
https://www.smithsonianmag.com/history/document-deep-dive-richard-nixons-application-join-fbi-180950329/
Document Deep Dive: Richard Nixon’s Application to Join the FBI
Document Deep Dive: Richard Nixon’s Application to Join the FBI The abridged biography of Richard Nixon, as most know it, goes something like this. Born the son of a grocer and housewife, Nixon grew up in southern California and attended Whittier College, a small liberal arts college less than 20 miles from Los Angeles. He graduated from Duke University’s law school, moved home to California and started practicing law. He was first elected as a U.S. congressman in 1946 and then as a senator in 1950; he went on to serve as vice president and eventually president, before resigning in the wake of the Watergate scandal. The National Archives, however, adds a surprising little insert into that timeline. That is, a 24-year-old Nixon applied to be a special agent in the FBI in 1937. Submitted on April 23, Nixon’s application, once part of the FBI’s files, is now in the holdings of the National Archives. For likely the first time ever, the document is on display to the public in “Making Their Mark: Stories Through Signatures,” an exhibition featuring more than 100 signed artifacts at the archives through January 5, 2015. “It is a nice window into a moment in Richard Nixon’s life that people probably don’t think about,” says Jennifer Johnson, the exhibition’s curator. “He has just finished law school, and like everyone, he is clearly trying to figure out what he wants to do.” As the story goes, Nixon attended a lecture by an FBI special agent while studying at Duke. Just before he graduated with his law degree in June 1937, he formally applied to the bureau. He was contacted for an interview, which took place in July of that year, and completed a physical exam at the request of J. Edgar Hoover, director of the FBI. But, after that, it was radio silence. He never received a response. On June 11, 1954, then-Vice President Richard Nixon spoke at the FBI National Academy’s graduation. Hoover actually introduced him, saying that he took special pleasure in doing so, because Nixon had once applied to the bureau. “Having already embarked upon the practice of law, the FBI’s loss ultimately became the country’s gain,” remarked Hoover. Nixon, in a later address to the academy, said he “never heard anything from that application.” In his memoirs, Nixon describes being at a party during his vice presidency, when he approached Hoover and expressed an interest in knowing what had happened. The exchange prompted the FBI to open Nixon’s file. Apparently, Nixon was accepted, but his appointment was revoked in August 1937, before he was ever notified. The details are murky. According to Nixon, Hoover told him that he was ultimately not hired due to budget cuts made to the bureau that year. But, it has also been said that Nixon’s plan to take the California bar exam in September didn’t jibe with the FBI’s hiring schedule. Either way, it is an interesting game of “what if,” says Johnson. Megan Gambino is an editor and writer for Smithsonian.com and founded “Document Deep Dive.” Previously, she worked for Outside magazine in New Mexico.
f88645123b4fd1b318f7d7e55c874c30
https://www.smithsonianmag.com/history/document-deep-dive-what-does-the-magna-carta-really-say-166954663/
Document Deep Dive: What Does the Magna Carta Really Say?
Document Deep Dive: What Does the Magna Carta Really Say? Last month, the 1297 Magna Carta, a prized artifact at the National Archives in Washington, D.C., returned to view after ten months of conservation work. With funds from the document’s owner, David M. Rubenstein, conservators at the archives used ultraviolet photography to reveal text that was lost to the naked eye due to water damage. They also removed old repairs and adhesives that were causing the document to contract, humidified and flattened the parchment and placed it in a high-tech case filled with inert argon gas, all to ensure that it is preserved long into the future. “We have every reason to believe that 800 years from now it will be in fabulous shape,” said Kitty Nicholson, deputy director of the National Archives Conservation Lab. It was nearly 800 years ago, after all, on June 15, 1215, that a group of noblemen presented the first version of Magna Carta to King John at Runnymede, just over 20 miles west of London on the River Thames. In the charter, the barons of England’s feudal system listed demands that would protect their rights and prevent tyranny. King John, who had been abusing his power, at first agreed to the stipulations set forth in the document. But weeks later, when the agreement was annulled, civil war broke out, a war that ultimately claimed the king’s life. During the reigns of King John, his son Henry III and grandson Edward I, the charter was revised several times. Today, 17 original versions of Magna Carta, penned from 1215 to 1297, survive. Rubenstein, co-founder of the Carlyle Group, purchased one of four existing originals of the 1297 Magna Carta at auction in 2007 for $21.3 million. “This is the one that is really the law of the land of England,” said Rubenstein at the National Archives in Washington, D.C. this February. Whereas the 1215 Magna Carta was abrogated, King Edward I accepted the 1297 version and made it law by adding it to the Statute Rolls of England. This particular document also has the distinction of being the only Magna Carta that is privately owned and that resides in the United States. Rubenstein has permanently loaned it to the National Archives. Texas billionaire Ross Perot, its previous owner, had bought the charter in 1984 from the Brudenells, an English family who possessed it for centuries. The newly encased Magna Carta is presented in a way that makes the document more accessible to the public. For the first time, visitors to the National Archives can read the charter in English on touch-screen monitors installed on either side of it. (The original is in Latin.) They can navigate the document and read about what was going on at the time in England to prompt the noblemen’s petitions. The tool also highlights the ways in which Magna Carta influenced the Declaration of Independence, the Constitution and the Bill of Rights, displayed in an adjoining rotunda. Here, Alice Kamps, a curator at the National Archives, annotates a translation of the 1297 Magna Carta, providing context for specific parts and drawing connections to America’s Charters of Freedom. Click on the yellow tabs to read her comments. Megan Gambino is an editor and writer for Smithsonian.com and founded “Document Deep Dive.” Previously, she worked for Outside magazine in New Mexico.
ba70bf6f65443221369af7e55a86cee9
https://www.smithsonianmag.com/history/dogs-front-lines-artifact-smuggling-180968398/
Dogs May Soon Be on the Front Lines in the Fight Against Artifact Smuggling
Dogs May Soon Be on the Front Lines in the Fight Against Artifact Smuggling At the University of Pennsylvania, Roxie, Moxie, Pacy, Scout and Grizzly are ready for their first archaeology class. The pupils—four Labradors and a German shepherd—each take a turn with a trainer in a quiet room to focus on the task at hand: sniffing cotton that had been sealed in a bag with bits of ancient Syrian pottery, and then getting a treat. At the Penn Vet Working Dog Center, researchers have taught dogs to detect bombs, drugs, arson, people, even cancer. But this is the first time dogs are learning to recognize the smell of artifacts. The goal is to reduce the smuggling of archaeological treasures from Syria and Iraq, where looting has skyrocketed as a source of funding for terrorist groups. The K-9 Artifact Finders project is a collaboration with Penn’s Museum of Archaeology and Anthropology and Red Arch Cultural Heritage Law and Policy Research, a nonprofit trying to stem the trafficking. “How do you keep these smuggled cultural artifacts from crossing borders without searching every shipment, every suitcase?” asks Ricardo St. Hilaire, the executive director of Red Arch. The K-9 initiative was his idea, a way to see if detection dogs can curb trafficking at airports, seaports and other places stolen antiquities slip through security. After the dogs learn to identify the scent of the artifacts, they will be tested for sensitivity, says WDC director Cynthia Otto. Can canines smell the difference between pottery pieces that have sat in a museum for years and those that are recently excavated—and thus more likely to have been looted from an archaeological site? Do antiquities from Syria smell the same as ones from Iraq? A couple of weeks after their first sniff, all the canines in this inaugural class recognize the odor of the pottery. Moxie has caught on especially quickly. Circling a steel disk with compartments hiding various items—rubber gloves, paper, plain cotton—she stops suddenly at the one containing the pottery-scented cotton and noses it enthusiastically. “Woo-hoo!” the trainer trills. Tail wagging, Moxie trots over to get a small bite of hot dog, then continues her training as a soldier in the war on terror. This article is a selection from the April issue of Smithsonian magazine. Christine Speer Lejeune is a writer whose work has appeared in Philadelphia magazine and the Nashville City Paper.
48b7bdfb4c401238fbdf456b2d15c565
https://www.smithsonianmag.com/history/driving-while-black-has-been-around-long-cars-have-existed-180958598/
“Driving While Black” Has Been Around As Long As Cars Have Existed
“Driving While Black” Has Been Around As Long As Cars Have Existed For African-American travelers in the Jim Crow-era South—often journeying from the north to visit relatives who had not joined the Great Migration—an unprepossessing paper-bound travel guide often amounted to a survival kit. The Green Book functioned as a lifesaver. Visionary publisher-entrepreneur Victor Green, a Harlem postal carrier, introduced the travel guide in 1937. For blacks denied access to restaurants, hotels and restrooms—and who often risked even greater danger if they were driving after dark—it was an essential resource, listing hundreds of establishments, across the South and the nation, that welcomed African-Americans. Before the 1964 Civil Rights Act outlawed segregation, the Green Book sold in the millions and was passed from family member to family member. For those who relied on it, it amounted to an essential safety precaution. Today, it is a potent artifact of discrimination. The Green Book is also the subject of filmmaker Ric Burns’s forthcoming documentary. Burns is exploring the Green Book as a window into history, and into the present, where the experience of driving while black is again at the center of our national conversation. I spoke with Burns about what he’s learned so far in making this film. How did you encounter the Green Book originally? A colleague of mine named Gretchen Sorin, who runs a museum institute in Cooperstown, is an extraordinary historian who did her dissertation on the Green Book decades ago. And she approached me some time ago and said, “Let’s do a film about this.” And there’s nobody who knows more about the Green Book than her. And she just really sort of made it her own, did oral histories, went to many of the places, has collected over a couple of decades an amazing archive of material. And what drew you to the Green Book project? I was born in 1955, so anybody who’s got roots through their own life or their parents or their grandparents, during the era when America became a car culture. Right. You know, all those things like the old Esso sign, motels, Howard Johnson’s. It’s part of the inner imaginary of America. And what non-African-American Americans don’t know is that that story has a completely different cast to it. It just unfolded in a completely different way, so as you’re driving into Greenville, Texas, across whose main street the banner reads, “Greenville, Texas. The black is soil, the white is people.” You’re having a different experience in the family car. We’re making a film called “Driving While Black,” which is covering this period when suddenly the automobile dawns for black Americans as it does for all Americans. It’s like mobility. You have agency. You’re not dependent on somebody else’s timetable or schedule. You go where you want, when you want. But for black Americans, suddenly, the whole question of mobility and race in America is a huge powder keg. Now you as a black person are crossing white space. What happens when your car breaks down? What happens when you need to get gas? What happens when your four-year-old needs to go to the bathroom? Where are you going to eat? Where are you going to sleep? God forbid something should happen like a car accident, a medical emergency. How are you going to get to the hospital? What hospital will take you? I mean, this whole inventory of experience. All of which we are so deeply, so intimately, in the homeliest way, associated with the American experience. I mean, it’s all this simple stuff. 
As soon as there was a car, there was that agency, but there were also those challenges. [This film] is an opportunity to fill in a blank spot on the inner map of America. Where you kind of go, “Well, there’s the Civil War and then there’s something called Reconstruction, maybe Jim Crow means something to people, but really what’s something that organizes credibly and resonantly, the experience of race in America in the 1920s down through the Civil Rights Movement?” What were some of the unexpected discoveries that you’ve made with sources? What were some of your surprises during the time that you’ve been excavating this? We’re right in kind of the first phases of it, just beginning filming it. So those surprises are still to come. But I’ll say, the incredible thing about this topic, this whole area, is a surprise for non-African-American Americans. Because what dawns on you is that there’s a reality that you never really understood existed. And once it’s there, that surprising revelation is completely transformative. One of the things that made the automobile so enfranchising for black Americans was that it was a little hard to see who was driving a car. As [Nobel laureate and economist] Gunnar Myrdal put it, equality begins at around 25 miles per hour. All these elaborate codes (e.g. black Americans must stop and give way to white Americans) begin to go by the wayside. You’re kind of in your own self-enclosed world as you move through the highway world of America. And you have what contact you wish to have. And you can also not have contact if you wish not to have contact. That made this experience one which was both all too familiar in ways that were happy for black Americans and also very, very frustrating, and sometimes lethal. And for white Americans, completely unknown. The Negro Motorist Green Book. And it was just one of many. The Go Guide, the Travel Guide. The Travel Guide has this wonderful slogan on the cover: “Vacation and Recreation Without Humiliation.” Oh, that’s great. I loved the fact that Victor Green truncated the great Mark Twain quote, “Travel is fatal to prejudice” and put it on the cover of every issue of the thing. But the whole quote is, “Travel is fatal to prejudice, bigotry, and narrow-mindedness and many of our people need it sorely on these accounts.” What else have you learned? If you were a musician or an athlete, you were doing a lot of traveling around America, and cars made it much easier to get to where you wanted to go, and Green Books made it easier to find the places to stay; nevertheless, driving while black was always really difficult. There’s kind of a painful existential bottom line here. It’s integrated into the reality of American experience. Thurgood Marshall has an incredible story about the “sundown town.” He’s in Shreveport and basically the police are saying, “Nigger boy, what are you doing here? You better be out of town before sundown.” Who but African-Americans happen to have in their heads “sundown town” as a reality? It’s not for nothing that the last Green guide is published in 1966. And it’s not for nothing that Victor Green said in his editor’s note in the beginning, the time will come and I hope it comes soon, that this guide will no longer be necessary. But until it is, happy motoring, folks. And there’s all sorts of stuff. Esso, the kind of way in which commerce and consumerism and capitalism saw ways of marketing to new demographics, so God bless Esso, now Exxon. They saw the opportunity and went, you know what? 
We’re reaching out. And the reason why we’re having this conversation is because of the relationship that Victor Green established with Standard Oil. Exactly, exactly. And that put the Green Book on the map in a particularly special way. My family, when we drove our American Rambler into an Esso station in 1958 in Delaware. Even though I could ask my mother and father, and I did in Rehoboth, Delaware. There may not quite be sundown towns in Pennsylvania or Michigan, maybe in name only they’re not sundown towns. When you sort of think about the overall narrative arc, do you see a sort of overall beginning, middle, end narrative arc that’s going to be imposed on this film yet? We have a strong idea of it. The main narrative picks up when the automobile goes national. And when not just wealthy people can afford it. It’s roughly contemporaneous with the Green Book. First edition, ’36; last edition, ’66. Really, you know, the issue of mobility and the African-American experience in North America are connected from the start. There’s no way to understand that story without understanding what mobility and race meant from the time slaves were involuntarily moved here. Or involuntarily kept in place. So it’s going to be very important not just to go, “Well, this just appeared just like a genie from a bottle,” you know, in 1925 when cars become more readily available to black as well as white Americans. You need to be able to understand that sure, we had Civil Rights in this country as a movement. Post-second World War, the ’50s, Brown v. Board of Education, the great steps forward in the 1960s, ’64, ’65. But there’s no African-American, male or female, who does not know what it means to have a special worry and special instructions… Gretchen Sorin’s son Greg works in my office. He got the talk from his dad. “Here’s what happens if and when you get stopped, and Greg, you’re going to get stopped. Keep your hands where they can see them. Don’t make any sudden movements, Greg.” Greg is 23; he was born in the 1990s. His father’s white, his mother’s black. I mean, this is an experience that is so current that that’s why we’ve chosen not to name the film “The Green Book,” but “Driving While Black.” In the 1941 edition, and apparently in other editions occasionally, people contributed first-person essays. And in the 1941 edition, the essay is by a guy who took a trip to New England and into Canada to Quebec. And there is astonishment at the kind, hopeful, and civil encounter that they have in their first-person account with the police on a corner of the street in Quebec. So there’s that in there too. Race is the crucible of American history and we’re at another one of the crossroads. And we’re getting to know, “we” meaning non-black America, in a more intimate way, what race and racism mean. So the constitutional legal battles have been fought and, at least in name, won. Now we’re moving to the areas of the economy, culture, the thoughts and feelings; the hearts and minds of human beings. That’s where there’s—surprise, surprise—an enormous amount of work. And the confrontations are so painful. They just… We got a long way to go. And you know, the Green Book is kind of…enjoying a moment of public awareness. I’m looking at the pages, it’s quite visceral. It’s really visceral because …it’s where we all live. And so suddenly you realize what’s going on in plain sight. So it’s not some foreign vocabulary; it’s not happening somewhere else. 
It’s happening, you know… And it’s not a diner in a black-and-white 1960s civil rights kind of context. Right. You know it’s our experience and our parents’ experience and our grandparents’ experience. And doing this thing which is as American as apple pie: Getting in your car and going someplace. Whether it’s for the afternoon or for the summer, or for a job, or to get away. And that right there in the middle of the open American road, we find these shadows and conflicts and really excruciating human circumstances. Kathleen Burke is a senior editor at Smithsonian magazine.
caa9ba075bd97ef7c999ce2643a8af10
https://www.smithsonianmag.com/history/duel-104161025/
Duel!
Duel! The story, as Parson Weems tells it, is that in 1754 a strapping young militia officer named George Washington argued with a smaller man, one William Payne, who made up for the disparity in size by knocking Washington down with a stick. It was the kind of affront that, among a certain class of Virginia gentlemen, almost invariably called for a duel. That must have been what Payne was expecting when Washington summoned him to a tavern the following day. Instead, he found the colonel at a table with a decanter of wine and two glasses. Washington apologized for the quarrel, and the two men shook hands. Whether or not this actually happened—and some biographers believe that it did—is almost beside the point. Weems’ intention was to reveal Washington as he imagined him: a figure of profound self-assurance capable of keeping an overheated argument from turning into something far worse. At a time in America when the code of the duel was becoming a law unto itself, such restraint was not always apparent. Alexander Hamilton was the most celebrated casualty of the dueling ethic, having lost his life in an 1804 feud with Aaron Burr on the fields of Weehawken, New Jersey, but there were many more who paid the ultimate price—congressmen, newspaper editors, a signer of the Declaration of Independence (the otherwise obscure Button Gwinnett, famous largely for being named Button Gwinnett), two U.S. senators (Armistead T. Mason of Virginia and David C. Broderick of California) and, in 1820, the rising naval star Stephen Decatur. To his lasting embarrassment, Abraham Lincoln barely escaped being drawn into a duel early in his political career, and President Andrew Jackson carried in his body a bullet from one duel and some shot from a gunfight that followed another. Not that private dueling was a peculiarly American vice. The tradition had taken hold in Europe several centuries earlier, and though it was frequently forbidden by law, social mores dictated otherwise. During the reign of George III (1760-1820), there were 172 known duels in England (and very likely many more kept secret), resulting in 69 recorded fatalities. At one time or another, Edmund Burke, William Pitt the Younger and Richard Brinsley Sheridan all took the field, and Samuel Johnson defended the practice, which he found as logical as war between nations: “A man may shoot the man who invades his character,” he once told biographer James Boswell, “as he may shoot him who attempts to break into his house.” As late as 1829 the Duke of Wellington, then England’s prime minister, felt compelled to challenge the Earl of Winchelsea, who had accused him of softness toward Catholics. In France, dueling had an even stronger hold, but by the 19th century, duels there were seldom fatal, since most involved swordplay, and drawing blood usually sufficed to give honor its due. (Perhaps as a way of relieving ennui, the French weren’t averse to pushing the envelope in matters of form. In 1808, two Frenchmen fought in balloons over Paris; one was shot down and killed with his second. Thirty-five years later, two others tried to settle their differences by skulling each other with billiard balls.) In the United States, dueling’s heyday began at around the time of the Revolution and lasted the better part of a century. The custom’s true home was the antebellum South. 
Duels, after all, were fought in defense of what the law would not defend—a gentleman’s sense of personal honor—and nowhere were gentlemen more exquisitely sensitive on that point than in the future Confederacy. As self-styled aristocrats, and frequently slaveholders, they enjoyed what one Southern writer describes as a “habit of command” and an expectation of deference. To the touchiest among them, virtually any annoyance could be construed as grounds for a meeting at gunpoint, and though laws against dueling were passed in several Southern states, the statutes were ineffective. Arrests were infrequent; judges and juries were loath to convict. In New England, on the other hand, dueling was viewed as a cultural throwback, and no stigma was attached to rejecting it. Despite the furious sectional acrimony that preceded the Civil War, Southern congressmen tended to duel each other, not their Northern antagonists, who could not be relied upon to rise to a challenge. Consequently, when South Carolina congressman Preston Brooks was offended by Massachusetts senator Charles Sumner’s verbal assault on the congressman’s uncle, he resorted to caning Sumner insensible on the floor of the Senate. His constituents understood. Though Brooks was reviled in the North, he was lionized in much of the South, where he was presented with a ceremonial cane inscribed “Hit Him Again.” (Brooks said he had used a cane rather than a horsewhip because he was afraid Sumner might wrestle the whip away from him, in which case Brooks would have had to kill him. He didn’t say how.) Curiously, many who took part in the duel professed to disdain it. Sam Houston opposed it, but as a Tennessee congressman, shot Gen. William White in the groin. Henry Clay opposed it, but put a bullet through Virginia senator John Randolph’s coat (Randolph being in it at the time) after the senator impugned his integrity as secretary of state and called him some colorful names. Hamilton opposed dueling, but met Aaron Burr on the same ground in New Jersey where Hamilton’s eldest son, Philip, had died in a duel not long before. (Maintaining philosophical consistency, Hamilton intended to hold his fire, a common breach of strict dueling etiquette that, sadly, Burr didn’t emulate.) Lincoln, too, objected to the practice, but got as far as a dueling ground in Missouri before third parties intervened to keep the Great Emancipator from emancipating a future Civil War general. So why did such rational men choose combat over apology or simple forbearance? Perhaps because they saw no alternative. Hamilton, at least, was explicit. “The ability to be in future useful,” he wrote, “ . . . in those crises of our public affairs which seem likely to happen . . . imposed on me (as I thought) a peculiar necessity not to decline the call.” And Lincoln, though dismayed to be called to account for pricking the vanity of a political rival, couldn’t bring himself to extend his regrets. Pride obviously had something to do with this, but pride compounded by the imperatives of a dueling society. For a man who wanted a political future, walking away from a challenge may not have seemed a plausible option. The Lincoln affair, in fact, affords a case study in how these matters were resolved—or were not. The trouble began when Lincoln, then a Whig representative in the Illinois legislature, wrote a series of satirical letters under the pseudonym Rebecca, in which he made scathing fun of State Auditor James Shields, a Democrat. 
The letters were published in a newspaper, and when Shields sent him a note demanding a retraction, Lincoln objected to both the note’s belligerent tone and its assumption that he had written more of them than he had. (In fact, Mary Todd, not yet Lincoln’s wife, is believed to have written one of the letters with a friend.) Then, when Shields asked for a retraction of the letters he knew Lincoln had written, Lincoln refused to do so unless Shields withdrew his original note. It was a lawyerly response, typical of the verbal fencing that often preceded a duel, with each side seeking the moral high ground. Naturally, it led to a stalemate. By the time Lincoln agreed to a carefully qualified apology provided that first note was withdrawn—in effect asking Shields to apologize for demanding an apology—Shields wasn’t buying. When Lincoln, as the challenged party, wrote out his terms for the duel, hopes for an accommodation seemed ended. The terms themselves were highly unusual. Shields was a military man; Lincoln was not. Lincoln had the choice of weapons, and instead of pistols chose clumsy cavalry broadswords, which both men were to wield while standing on a narrow plank with limited room for retreat. The advantage would obviously be Lincoln’s; he was the taller man, with memorably long arms. “To tell you the truth,” he told a friend later, “I did not want to kill Shields, and felt sure that I could disarm him . . . ; and, furthermore, I didn’t want the damned fellow to kill me, which I rather think he would have done if we had selected pistols.” Fortunately, perhaps for both men, and almost certainly for one of them, each had friends who were determined to keep them from killing each other. Before Shields arrived at the dueling spot, their seconds, according to Lincoln biographer Douglas L. Wilson, proposed that the dispute be submitted to a group of fair-minded gentlemen—an arbitration panel of sorts. Though that idea didn’t fly, Shields’ seconds soon agreed not to stick at the sticking point. They withdrew their man’s first note on their own, clearing the way for a settlement. Shields went on to become a United States senator and a brigadier general in the Union Army; Lincoln went on to be Lincoln. Years later, when the matter was brought up to the president, he was adamant. “I do not deny it,” he told an Army officer who had referred to the incident, “but if you desire my friendship, you will never mention it again.” If Lincoln was less than nostalgic about his moment on the field of honor, others saw dueling as a salutary alternative to simply gunning a man down in the street, a popular but déclassé undertaking that might mark a man as uncouth. Like so many public rituals of the day, dueling was, in concept at least, an attempt to bring order to a dangerously loose-knit society. The Englishman Andrew Steinmetz, writing about dueling in 1868, called America “the country where life is cheaper than anywhere else.” Advocates of the duel would have said that life would have been even cheaper without it. Of course, the attitudes dueling was meant to control weren’t always controllable. When Gen. Nathanael Greene, a Rhode Islander living in Georgia after the Revolution, was challenged by Capt. James Gunn of Savannah regarding his censure of Gunn during the war, Greene declined to accept. But feeling the honor of the Army might be at stake, he submitted the matter to George Washington. 
Washington, who had no use for dueling, replied that Greene would have been foolish to take up the challenge, since an officer couldn’t perform as an officer if he had to worry constantly about offending subordinates. Indifferent to such logic, Gunn threatened to attack Greene on sight. Greene mooted the threat by dying peacefully the following year. Even more than Captain Gunn, Andrew Jackson was an excitable sort with a famously loose rein on his temper. A survivor—barely—of several duels, he nearly got himself killed following a meeting in which he was merely a second, and in which one of the participants, Jesse Benton, had the misfortune to be shot in the buttocks. Benton was furious, and so was his brother, future U.S. senator Thomas Hart Benton, who denounced Jackson for his handling of the affair. Not one to take denunciation placidly, Jackson threatened to horsewhip Thomas and went to a Nashville hotel to do it. When Thomas reached for what Jackson supposed was his pistol, Jackson drew his, whereupon the irate Jesse burst through a door and shot Jackson in the shoulder. Falling, Jackson fired at Thomas and missed. Thomas returned the favor, and Jesse moved to finish off Jackson. At this point, several other men rushed into the room, Jesse was pinned to the floor and stabbed (though saved from a fatal skewering by a coat button), a friend of Jackson’s fired at Thomas, and Thomas, in hasty retreat, fell backward down a flight of stairs. Thus ended the Battle of the City Hotel. It was just this sort of thing that the code of the duel was meant to prevent, and sometimes it may have actually done so. But frequently it merely served as a scrim giving cover to murderers. One of the South’s most notorious duelists was a hard-drinking homicidal miscreant named Alexander Keith McClung. A nephew of Chief Justice John Marshall—though likely not his favorite nephew, after engaging in a duel with a cousin—McClung behaved like a character out of Gothic fiction, dressing from time to time in a flowing cape, given to overripe oratory and morbid poetry, and terrifying many of his fellow Mississippians with his penchant for intimidation and violence. A crack shot with a pistol, he preferred provoking a challenge to giving one, in order to have his choice of weapons. Legend has it that after shooting Vicksburg’s John Menifee to death in a duel, the Black Knight of the South, as McClung was known, killed six other Menifees who rose in turn to defend the family honor. All of this reportedly generated a certain romantic excitement among women of his acquaintance. Wrote one: “I loved him madly while with him, but feared him when away from him; for he was a man of fitful, uncertain moods and given to periods of the deepest melancholy. At such times he would mount his horse, Rob Roy, wild and untamable as himself, and dash to the cemetery, where he would throw himself down on a convenient grave and stare like a madman into the sky. . . . ” (The woman refused his proposal of marriage; he didn’t seem the domestic type.) Expelled from the Navy as a young man, after threatening the lives of various shipmates, McClung later served, incredibly, as a U.S. marshal and fought with distinction in the Mexican War. In 1855, he brought his drama to an end, shooting himself in a Jackson hotel. 
He left behind a final poem, “Invocation to Death.” Though the dueling code was, at best, a fanciful alternative to true law and order, there were those who believed it indispensable, not only as a brake on shoot-on-sight justice but as a way of enforcing good manners. New Englanders may have prided themselves on treating an insult as only an insult, but to the South’s dueling gentry, such indifference betrayed a lack of good breeding. John Lyde Wilson, a former governor of South Carolina who was the foremost codifier of dueling rules in America, thought it downright unnatural. A high-minded gentleman who believed the primary role of a second was to keep duels from happening, as he had done on many occasions, he also believed that dueling would persist “as long as a manly independence and a lofty personal pride, in all that dignifies and ennobles the human character, shall continue to exist.” Hoping to give the exercise the dignity he felt sure it deserved, he composed eight brief chapters of rules governing everything from the need to keep one’s composure in the face of an insult (“If the insult be in public . . . never resent it there”) to ranking various offenses in order of precedence (“When blows are given in the first instance and returned, and the person first striking be badly beaten or otherwise, the party first struck is to make the demand [for a duel or apology], for blows do not satisfy a blow”) to the rights of a man being challenged (“You may refuse to receive a note from a minor. . . , [a man] that has been publicly disgraced without resenting it. . . , a man in his dotage [or] a lunatic”). Formal dueling, by and large, was an indulgence of the South’s upper classes, who saw themselves as above the law—or at least some of the laws—that governed their social inferiors. It would have been unrealistic to expect them to be bound by the letter of Wilson’s rules or anyone else’s, and of course they were not. If the rules specified smoothbore pistols, which could be mercifully inaccurate at the prescribed distance of 30 to 60 feet, duelists might choose rifles or shotguns or bowie knives, or confront each other, suicidally, nearly muzzle to muzzle. If Wilson was emphatic that the contest should end at first blood (“no second is excusable who permits a wounded friend to fight”), contestants might keep on fighting, often to the point where regret was no longer an option. And if seconds were obliged to be peacemakers, they sometimes behaved more like promoters. But if bending the rules made dueling even bloodier than it had to be, strict adherence could be risky too. Some would-be duelists discovered that even the code’s formal preliminaries might set in motion an irreversible chain of events. When, in 1838, Col. James Watson Webb, a thuggish Whig newspaper editor, felt himself abused in Congress by Representative Jonathan Cilley, a Maine Democrat, he dispatched Representative William Graves of Kentucky to deliver his demand for an apology. When Cilley declined to accept Webb’s note, Graves, following what one Whig diarist described as “the ridiculous code of honor which governs these gentlemen,” felt obliged to challenge Cilley himself. Subsequently, the two congressmen, who bore each other not the slightest ill will, adjourned to a field in Maryland to blast away at each other with rifles at a distance of 80 to 100 yards. 
After each exchange of shots, negotiations were conducted with a view to calling the whole thing off, but no acceptable common ground could be found, though the issues still at stake seemed appallingly trivial. Graves’ third shot struck Cilley and killed him. Though President Van Buren attended Cilley’s funeral, the Supreme Court refused to be present as a body, as a protest against dueling, and Graves and his second, Representative Henry Wise of Virginia, were censured by the House of Representatives. On the whole, though, outrage seemed to play out along party lines, with Whigs less dismayed by the carnage than Democrats. Congressman Wise, who had insisted the shooting continue, over the protests of Cilley’s second, was particularly defiant. “Let Puritans shudder as they may,” he cried to his Congressional colleagues. “I belong to the class of Cavaliers, not to the Roundheads.” Ultimately, the problem with dueling was the obvious one. Whatever rationale its advocates offered for it, and however they tried to refine it, it still remained a capricious waste of too many lives. This was especially true in the Navy, where boredom, drink and a mix of spirited young men in close quarters on shipboard produced a host of petty irritations ending in gunfire. Between 1798 and the Civil War, the Navy lost two-thirds as many officers to dueling as it did to more than 60 years of combat at sea. Many of those killed and maimed were teenage midshipmen and barely older junior officers, casualties of their own reckless judgment and, on at least one occasion, the by-the-book priggishness of some of their shipmates. In 1800, Lt. Stephen Decatur, who was to die in a celebrated duel 20 years later, laughingly called his friend Lieutenant Somers a fool. When several of his fellow officers shunned Somers for not being suitably resentful, Somers explained that Decatur had been joking. No matter. If Somers didn’t challenge, he would be branded a coward and his life made unbearable. Still refusing to fight his friend Decatur, Somers instead challenged each of the officers, to be fought one after another. Not until he had wounded one of them, and been so seriously wounded himself that he had to fire his last shot from a sitting position, would those challenged acknowledge his courage. The utter pointlessness of such encounters became, in time, an insult to public opinion, which by the Civil War had become increasingly impatient with affairs of honor that ended in killing. Even in dueling’s heyday, reluctant warriors were known to express reservations about their involvement by shooting into the air or, after receiving fire, not returning it. Occasionally they chose their weapons—howitzers, sledgehammers, forkfuls of pig dung—for their very absurdity, as a way of making a duel seem ridiculous. Others, demonstrating a “manly independence” that John Lyde Wilson might have admired, felt secure enough in their own reputations to turn down a fight. It may not have been difficult, in 1816, for New Englander Daniel Webster to refuse John Randolph’s challenge, or for a figure as unassailable as Stonewall Jackson, then teaching at the Virginia Military Institute, to order the court-martial of a cadet who challenged him over a supposed insult during a lecture. But it must have been a different matter for native Virginian Winfield Scott, a future commanding general of the Army, to turn down a challenge from Andrew Jackson after the War of 1812. 
(Jackson could call him whatever he chose, said Scott, but he should wait until the next war to find out if Scott were truly a coward.) And it had to be riskier still for Louisville editor George Prentice to rebuke a challenger by declaring, “I do not have the least desire to kill you. . . . and I am not conscious of having done anything to entitle you to kill me. I do not want your blood upon my hands, and I do not want my own on anybody’s. . . . I am not so cowardly as to stand in dread of any imputation on my courage.” If he did not stand in such dread, others did, since the consequences of being publicly posted as a coward could ruin a man. Yet even in dueling’s heartland south of the Mason-Dixon line, the duel had always had its opponents. Anti-dueling societies, though ineffectual, existed throughout the South at one time, and Thomas Jefferson once tried in vain to introduce in Virginia legislation as strict—though surely not so imaginative—as that in colonial Massachusetts, where the survivor of a fatal duel was to be executed, have a stake driven through his body, and be buried without a coffin. But time was on the side of the critics. By the end of the Civil War, the code of honor had lost much of its force, possibly because the country had seen enough bloodshed to last several lifetimes. Dueling was, after all, an expression of caste—the ruling gentry deigned to fight only its social near-equals—and the caste whose conceits it had spoken to had been fatally injured by the disastrous war it had chosen. Violence thrived; murder was alive and well. But for those who survived to lead the New South, dying for chivalry’s sake no longer appealed. Even among old dueling warriors, the ritual came to seem like something antique. Looking back on life’s foolishness, one South Carolina general, seriously wounded in a duel in his youth, was asked to recall the occasion. “Well I never did clearly understand what it was about,” he replied, “but you know it was a time when all gentlemen fought.” - ROSS DRAKE is a former editor at People magazine who now writes from Connecticut. This is his first article for SMITHSONIAN.
5e53aaa731f239edcd40a2fcb3872fc5
https://www.smithsonianmag.com/history/duplication-nation-3D-printing-rise-180954332/
How the Photocopier Changed the Way We Worked—and Played
How the Photocopier Changed the Way We Worked—and Played Recently I visited Whisk, a Manhattan store that sells kitchen goods, and next to the cash register was a strange, newfangled device: a 3-D printer. The store bought the device—which creates objects by carefully and slowly extruding layers of hot plastic—to print cookie cutters. Any shape you can think of, it can produce from a digital blueprint. There was a cutter in the shape of a thunderbolt, a coat of arms, a racing car. “Send it in the morning and we’ll have it ready in a week or two,” the store clerk told me. I wouldn’t even need to design my own cookie cutter. I could simply download one of hundreds of models that amateurs had already created and put online for anyone to use freely. In the world of 3-D printers, people are now copying and sharing not just text and pictures on paper, but physical objects. Once, 3-D printers were expensive, elite tools wielded by high-end designers who used them to prototype products like mobile phones or airplane parts. But now they’re emerging into the mainstream: You can buy one for about $500 to $3,000, and many enthusiasts, schools and libraries already have. Sometimes they print objects they design, but you can also make copies of physical objects by “scanning” them—using your smartphone or camera to turn multiple pictures into a 3-D model, which can then be printed over and over. Do you want a copy of, say, the Auguste Rodin statue Cariatide à l’urne—or maybe just some replacement plastic game pieces for Settlers of Catan? You’re in luck. Helpful folks have already scanned these objects and put them online. As 3-D printing gets cheaper and cheaper, how will it change society? What will it mean to be able to save and share physical objects—and make as many copies as we’d like? One way to ponder that is to consider the remarkable impact of the first technology that let everyday people duplicate things en masse: the Xerox photocopier. For centuries, if you weren’t going to the trouble of publishing an entire book, copying a single document was a slow, arduous process, done mostly by hand. Inventors had long sought a device to automate the process, with limited success. Thomas Jefferson used a pantograph: As he wrote, a wooden device connected to his pen manipulated another pen in precisely the same movements, creating a mechanical copy. Steam-engine pioneer James Watt created an even cruder device that would take a freshly written page and mash another sheet against it, transferring some of the ink in reverse. By the early 20th century, the state of the art was the mimeograph machine, which used ink to produce a small set of copies that got weaker with each duplication. It was imperfect. Then in 1959, Xerox released the “914”—the first easy-to-use photocopier. The culmination of more than 20 years of experimentation, it was a much cleaner, “dry” process. The copier created an electrostatic image of a document on a rotating metal drum, and used it to transfer toner—ink in a powdered format—to a piece of paper, which would then be sealed in place by heat. It was fast, cranking out a copy in as little as seven seconds. When the first desk-size, 648-pound machines were rolled out to corporate customers—some of whom had to remove doors to install these behemoths—the era of copying began. Or more accurately, the explosion of copying began. Xerox expected customers would make about 2,000 copies a month—but users easily made 10,000 a month, and some as many as 100,000. 
Before the 914 machine, Americans made 20 million copies a year, but by 1966 Xerox had boosted the total to 14 billion. “It was a huge change in the amount of information moving around,” said David Owen, author of Copies in Seconds, a history of Xerox. Indeed, it transformed the pathways through which knowledge flowed in a corporation. Before the Xerox, when an important letter arrived, only a small number of higher-ups clapped eyes on it. The original would circulate from office to office, with a “routing slip” showing who’d read it and where it should travel next. But after the photocopier arrived, employees began copying magazine articles and white papers they felt everyone else should see and circulating them with abandon. Wrote a memo? Why not send it to everyone? Copying was liberating and addicting. “The button waiting to be pushed, the whir of action, the neat reproduction dropping into the tray—all this adds up to a heady experience, and the neophyte operator of a copier feels an impulse to copy all the papers in his pockets,” as John Brooks wrote in a 1967 New Yorker article. White-collar workers had complained of information overload before. But the culprit was industrial processes—book publishers, newspapers. The photocopier was different. It allowed the average office drone to become an engine of overload, handing stacks of material to bewildered colleagues. “You’d have this huge pile of meeting documents,” Owen says with a laugh, “and nobody has read them.” Copying also infected everyday life. Employees would sneak their own personal items on the machine, copying their IRS returns, party invitations, recipes. Chain letters began demanding participants not only forward the letter, but send out 20 copies—because, hey, now anyone could! And people quickly realized they could make paper replicas of physical objects, placing their hands—or, whipping down their pants, their rear ends—on the copier glass. This copying of objects could be put to curiously practical purposes. Instead of describing the physical contents of a perp’s pockets when jailing him, police would just dump them onto the 914’s glass and hit copy. The bizarre welter of things being replicated made even the folks at Xerox worry they had unleashed Promethean forces. “Have we really made a contribution by making it easier to reproduce junk and nonsense?” as Sol Linowitz, CEO of Xerox International, fretted in Life magazine. Yet for everyday people, replicating nonsense was the best part of the copier—an illicit thrill. Hiding behind the anonymity of a duplicated document, office workers began circulating off-color jokes and cartoons. Sometimes it was fake memos that savagely mocked the idiocy of office life—a “Rush Job” calendar with jumbled dates, so a customer could “order his work on the 7th and have it delivered on the 3rd,” or an “organization chart” cartoon that consisted of an executive being kissed on the ring by a lesser executive, who also has a lesser executive kissing his ring, and on and on. Jokes about the intelligence of various ethnic groups abounded, as did sexually explicit material. Eye-popping cartoons depicted the “Peanuts” characters having sex. 
“There were these copies where you had a Rorschach blot and you had to fold it and hold it up to the light, and there were people having sex in more positions than you could imagine,” says Michael Preston, a professor emeritus of English at the University of Colorado at Boulder, who published an early collection of what he called Xerox-lore—the folklore of the copying age. Artists, too, flocked to the device, thrilled by the high-contrast, low-fi prints it produced—so unlike either photography or traditional printing. As they showed, photocopying had an aesthetic. “When I show it a hair curler it hands me back a space ship, and when I show it the inside of a straw hat it describes the eerie joys of a descent into a volcano,” said Pati Hill, an artist who became famous for using a photocopier. In essence, the photocopier was not merely a vehicle for copying. It became a mechanism for sub-rosa publishing—a way of seizing the means of production, circulating ideas that would previously have been difficult to get past censors and editors. “Xerography is bringing a reign of terror into the world of publishing, because it means that every reader can become both author and publisher,” Marshall McLuhan wrote in 1966. This had powerful political effects. Secrets were harder to keep, documents easier to leak. Daniel Ellsberg used a copier to reproduce the Pentagon Papers (even having his children help make the replicas at a friend’s office). Fearful of the copier’s power, the Soviet Union tightly controlled access to the machines. In the United States, activists for ACT-UP—the group that fought to have AIDS taken more seriously by doctors and politicians—had a powerful impact in part because they had access to copiers. Many worked at media giants like Condé Nast and NBC, and after doing their work would run off thousands of copies of fliers and posters they’d use to plaster New York City for AIDS-awareness campaigns. “They’d go in to do the paste-up for all these magazines, and then they would make thousands of posters and fliers that were so integral to what ACT-UP was doing,” notes Kate Eichhorn, an assistant professor at the New School who is writing a book about copiers. “These huge corporations were underwriting this radical activism.” This same force catalyzed the world of alternative culture: Fans of TV shows, sci-fi or movies began to produce zines, small publications devoted to their enthusiasms. The Riot Grrrl movement of young feminist musicians in the ’90s, appalled by mainstream media’s treatment of women, essentially created their own mediasphere partly via photocopiers. “Beyond its function as an ‘office tool,’ the copier has, for many people, become a means of self-expression,” said the authors of Copyart, a 1978 guide to DIY creativity. But all that copying worried traditional authors: Surely they were losing sales if someone could copy a chapter from a book, or an article from a magazine, without paying for the original. Libraries and universities were hotbeds of so much duplication that publishers eventually took their complaints to the courts—and, in the ’70s, lost. The courts, and Congress, decided that making copies for personal use was fine. “It was really a great moment in the late ’70s when it was a wonderful loosening of copyright,” says Lisa Gitelman, professor of English and media studies at New York University. 
These days, Congress is working hard—often at the behest of movie studios or record labels—in the opposite direction, making it harder for people to copy things digitally. But back in the first cultural glow of the Xerox, lawmakers and judges came to the opposite conclusion: Copying was good for society. Clive Thompson is author of Smarter Than You Think: How Technology is Changing Our Minds for the Better and Coders: The Making of a New Tribe and the Remaking of the World. He is a contributing writer to the New York Times Magazine and Wired.
8ec40be47824214306f9cf86ded0b5c1
https://www.smithsonianmag.com/history/during-cold-war-ci-secretly-plucked-soviet-submarine-ocean-floor-using-giant-claw-180972154/
During the Cold War, the C.I.A. Secretly Plucked a Soviet Submarine From the Ocean Floor Using a Giant Claw
During the Cold War, the C.I.A. Secretly Plucked a Soviet Submarine From the Ocean Floor Using a Giant Claw In a corner exhibit of the recently reopened International Spy Museum in Washington, D.C., a submarine control panel, a swoopy-banged wig, detailed whiteprints and a chunk of manganese are on display. Together, they represent relics of a Cold War espionage mission so audacious, the museum’s curator, Vince Houghton, compares it to the heist from Ocean’s 11. This mission, codenamed Project Azorian, involved the C.I.A. commissioning the construction of a 600-foot ship to retrieve a sunken Soviet submarine from the ocean floor—all in complete secrecy. “I can’t imagine there’s another country in the world that would have thought, ‘We found a Soviet submarine, under [more than three miles] of water. Let’s go steal it,’” says Houghton. The six-year mission began in 1968, when the Soviet ballistic missile submarine K-129 went missing without explanation somewhere in the Pacific Ocean. In this post-Cuban Missile Crisis era, both American and Soviet submarines prowled the open seas with nuclear weapons aboard, prepared for potential war. Some reports indicate that the sinking was due to a mechanical error such as inadvertent missile engine ignition, while the Soviets for a time suspected the Americans of foul play. After two months, the Soviet Union abandoned its search for K-129 and the nuclear weapons it carried, but the United States, which had recently used Air Force technology to locate two of its own sunken submarines, pinpointed the K-129 1,500 miles northwest of Hawaii and 16,500 feet below the surface. According to the declassified C.I.A. history of the project, “No country in the world had succeeded in raising an object of this size and weight from such a depth.” Internally, the intelligence community deliberated about the cost-to-reward ratio of such an expensive and risky undertaking even as the submarine offered a tantalizing trove of information. According to Houghton, the value of the K-129 stemmed not just from the code books and nuclear warheads onboard, but also the chance to understand the manufacturing process behind the rival power’s submarines. If the U.S. knew how the K-129’s sonar systems operated, or the mechanisms by which the submarines kept quiet, it could improve its ability to detect them. And by 1967, the Soviet Union had amassed an armament of nuclear weapons large enough that the two nations had “virtual nuclear parity,” Houghton explains. As a result, the Americans were hungry to gain a competitive advantage—an edge the K-129 might provide. The C.I.A. brainstormed several improbable-sounding means of recovering the submarine. One suggestion involved generating enough gas on the ocean floor to buoy the submarine to the surface. Instead, the agency settled on an idea reminiscent of the classic arcade game—a giant claw that would grasp and pull the K-129 into the “moon pool” belly of a giant ship. Initially, the project boasted an estimated ten percent chance of success. (Granted, that figure increased as Azorian approached completion.) Legally speaking, the U.S. was concerned that the project could leave it open to charges of piracy if the Soviets had an inkling of the illicit submarine-salvaging plans. Wanting to sidestep diplomatic tensions and keep whatever knowledge was to be gleaned from the mission secret, the C.I.A. constructed an elaborate cover story with the help of enigmatic billionaire Howard Hughes. 
The aviation mogul lent his imprimatur to the construction of the 618-foot-long ship, to be named the Hughes Glomar Explorer, which was advertised as a deep-sea mining research vessel. In 1972, a champagne christening ceremony and fabricated press release celebrated the ship. When the ship first sailed from Pennsylvania to waters near Bermuda for testing in 1973, the Los Angeles Times noted the occasion, calling the vessel “shrouded in secrecy” and observing, “Newsmen were not permitted to view the launch, and details of the ship’s destination and mission were not released.” Evidently, the public and press chalked the mystery up to Hughes’ reputation as a recluse, such a loner that he was said to eschew even his own company’s board meetings. Next, the Glomar Explorer navigated to the Pacific around South America—because it was too wide to pass through the Panama Canal. After some minor foibles (the U.S.-assisted 1973 Chilean coup happened the same day seven technicians were trying to board the ship in the country’s port city of Valparaíso), the Glomar Explorer arrived in Long Beach, California, where it loaded more than 20 vans full of equipment (including a darkroom and facilities for paper processing and nuclear waste handling) for analyzing the K-129’s contents. Meanwhile, a team built the claw (nicknamed “Clementine” and formally known as the “capture vehicle”) in a gargantuan floating barge called HMB-1 in Redwood City. In the spring of 1974, HMB-1 submerged and met up with the Glomar Explorer off the coast of Catalina Island in southern California. HMB-1 opened its roof, and the Glomar Explorer opened the bottom of its hollow “moon pool” to take the steel claw onboard. Then the HMB-1 detached and returned to Redwood City, the transfer unnoticed. That summer, the Glomar Explorer, with the approval of President Richard Nixon, set off towards the spot where the K-129 rested. By this point, the Cold War had reached a détente, but still, two separate Soviet ships (likely loaded with intelligence operatives) closely monitored the supposed mining vessel as it worked to retrieve the submarine. (At one point, Glomar crew members even piled crates on their landing deck to prevent any attempts to land a helicopter.) But the mission continued undetected—as the 274 pieces of heavy steel pipe that stretched between the claw and the ship were being slowly hauled back onboard, with the submarine in Clementine’s grasp, the second Soviet tug sailed away. After about a week of slow upward progress, Project Azorian finally completed the lift of the K-129—but only one part of it. According to Project AZORIAN: The CIA and the Raising of the K-129, a book co-written by naval historian Norman Polmar and documentary director Michael White, about midway through the process, a few of the grabber arms encircling the submarine broke, and a large part of the K-129 fell back to the ocean floor. While later media reports and history books generally relayed that the more desirable components of the submarine, like the code room, sank, Houghton encourages skepticism of the details surrounding the project’s ostensible failure. “The conventional wisdom has become that this was a failed mission,” he explains. “[The C.I.A. has] allowed that belief to be what everyone understands, but why would they not? I always say, ‘We have no idea what they got.’” (Many of the details in this story are sourced from C.I.A. 
declassified documents and recently published historical accounts, but since other findings from the mission are still classified, and the C.I.A. may have had reason to obfuscate the story, skepticism remains warranted.) We do know, however, that the Glomar Explorer retrieved the bodies of several of the K-129’s crewmembers, who were given a military burial at sea; the C.I.A. filmed the ceremony and gave the footage to Russia almost 20 years later. Coincidentally, the retrieval also brought up manganese samples from the bottom of the sea, the material that the Glomar Explorer purportedly was researching. The U.S. seemed to have gotten away with the elaborate submarine heist—Ford’s secretary of defense, James Schlesinger, said in a White House meeting, “The operation is a marvel.” In early 1975, however, after a random robbery of the headquarters of Hughes’ Summa Corporation, which was acting as a front for the Glomar Explorer, the story made its way to the headlines of the Los Angeles Times and national television. The story broke later than it could have—famed New York Times reporter Seymour Hersh had been following it as early as 1973 but honored a request from C.I.A. director William Colby to suppress the story—and the reports were riddled with inaccuracies. (The code name was thought to be “Jennifer,” which actually referred only to the project’s security procedures, and the L.A. Times report placed the recovery efforts in the Atlantic Ocean.) Nonetheless, it was enough to alert the Soviet Union and “disturb” (his words) President Ford. Project Matador, the plan to retrieve the rest of the K-129, apparently got nixed as news of the thought-to-have-failed mission and its rumored (but, Houghton says, ultimately unknowable) $300 million-plus price tag circulated. The C.I.A. also faced a diplomatic dilemma that spring. Pressed by the Soviet ambassador to the U.S. and by Freedom of Information Act requests from journalists, the agency wanted to avoid directly acknowledging that it had illicitly stolen a submarine from the watchful Soviets, but was obligated to somehow respond. “[The U.S. government] did not want to embarrass the Soviets,” Houghton says, “mainly because in doing so, [they] really set diplomacy back significantly, because the Soviet premier would have to respond” through sanctions or an attack on a territory. In the effort to walk this diplomatic tightrope and comply with FOIA requirements, the “Glomar response”—“we can neither confirm nor deny”—was coined. While the Glomar response stood up in federal court as a reason to deny a FOIA request, the incident, writes historian M. Todd Bennett, “intensified otherwise routine ‘Intelligence Wars,’ tit-for-tat actions taken by the Soviet and American intelligence services.” That May, Soviet operatives increased the amount of microwave radiation trained on the American embassy in Moscow. Forty-five years after the Glomar Explorer hauled (part of) the K-129 from the ocean floor, Project Azorian remains “legendary within the [intelligence] community,” Houghton says. The glass cases show the onesies worn by crew members onboard, phony belt-buckle “safety awards,” a barometer from the ship and even a wig C.I.A. deputy director Vernon Walters wore to pay the Glomar Explorer an incognito visit, but they also name-check engineer John Graham and display a scaled-down version of the detailed whiteprint used to design the now-defunct ship. 
Azorian stands out, Houghton says, because “it’s so bold, so ambitious, and it almost was guaranteed to fail.” And yet, although only part of the submarine was retrieved, the ship was built, the almost ridiculous proposition of a giant claw extending to the ocean floor proved functional, and despite the scale of the project, it stayed secret for seven years. The Spy Museum positions the Azorian saga as a paean to innovation, an exemplar of how the “unsolvable problems” of the intelligence world can be tackled with creativity and technological advances. Lila Thulin is the digital editorial assistant for Smithsonian magazine.
002fcb97b284c470660e1ded49861702
https://www.smithsonianmag.com/history/eastern-state-penitentiary-a-prison-with-a-past-14274660/
Eastern State Penitentiary: A Prison With a Past
Eastern State Penitentiary: A Prison With a Past In 1787, four years after the American Revolutionary War, the United States was a country brimming with possibility, and no city felt the excitement more than Philadelphia. Delegates such as Alexander Hamilton and James Madison were gathering at Independence Hall to draft what would later become the Constitution. That same year, a couple of blocks away from Independence Hall, at the home of Benjamin Franklin, another group of civic-minded leaders gathered to debate a wholly different matter: prison reform. Conditions at the Walnut Street Jail, located directly behind Independence Hall, were appalling. Men and women, adults and children, thieves and murderers were jailed together in disease-ridden, dirty pens where rape and robbery were common occurrences. Jailors made little effort to protect the prisoners from each other. Instead, they sold the prisoners alcohol, up to nearly twenty gallons of it a day. Food, heat, and clothing came at a price. It wasn't unusual for prisoners to die from the cold or starvation. A group of concerned citizens, calling themselves the Philadelphia Society for Alleviating the Miseries of Public Prisons, decided that this must not continue. What they would propose set the stage for prison reform not only in Pennsylvania, but also the world over. From its beginning, Pennsylvania was determined to be different from other colonies. Founder William Penn brought his Quaker values to the new colony, avoiding the harsh criminal code practiced in much of British North America, where death was the standard punishment for a litany of crimes, including the denial of the one "true God," kidnapping, and sodomy. Penn, instead, relied on imprisonment with hard labor and fines as the treatment for most crimes, while death remained the penalty only for murder. But upon Penn's passing in 1718, conservative groups did away with his Quaker-based system, and incorporated the harsh retributions that were the norm elsewhere. Jails simply became detention centers for prisoners as they awaited some form of corporal or capital punishment. It would take another seventy years before anyone would try to do away with this severe penal code. Dr. Benjamin Rush was a prominent Philadelphia physician with an interest in politics. In 1776, he served in the Second Continental Congress and signed the Declaration of Independence. More than a decade later, he would lead the push for ratification of the federal Constitution. He was an outspoken abolitionist, and would later earn the title "father of American psychiatry" for his groundbreaking observations about "diseases of the mind." As a newly minted doctor training in London in 1768, Rush ran into Benjamin Franklin, who was then serving as an agent to Parliament for the Pennsylvania Assembly. Franklin, a celebrity among the Parisians, urged the curious twenty-two-year-old to cross the English Channel and experience the Enlightenment thinking that filled French parlors. The following year, Rush did. He mingled among scientists, philosophers and literati, listening to progressive European theories about such issues as crime and punishment that would eventually follow him to America. In 1787 Rush was back in the company of Franklin and his American contemporaries proclaiming that a radical change was needed not just at the jail on Walnut Street, but the world over. 
He was convinced that crime was a "moral disease," and suggested a "house of repentance" where prisoners could meditate on their crimes, experience spiritual remorse and undergo rehabilitation. This method would later be called the Pennsylvania System and the institution a penitentiary. The Philadelphia Society for Alleviating the Miseries of Public Prisons, also known as the Pennsylvania Prison Society, agreed, and set out to convince the Commonwealth of Pennsylvania. Changes were made at the Walnut Street Jail—inmates were segregated by sex and crime, vocational workshops were instituted to occupy the prisoners' time, and much of the abusive behavior was abolished—but it wasn't enough. Philadelphia's population was growing by leaps and bounds, and so was the criminal element. A prison of a grander scale was needed to fulfill the prison society's mission. For repentance to truly happen, the complete isolation of each prisoner would need to occur, and this was impossible to do in these overcrowded jails. Construction of Eastern State Penitentiary began on a cherry orchard outside of Philadelphia in 1822. The chosen design, created by British-born architect John Haviland, was unlike any seen before: seven wings of individual cellblocks radiating from a central hub. The penitentiary opened in 1829, seven years before completion, but the institution proved to be a technological marvel. With central heating, flush toilets, and shower baths in each private cell, the penitentiary boasted luxuries that not even President Andrew Jackson could enjoy at the White House. Charles Williams, a farmer sentenced to two years for theft, would be inmate number one. On October 23, 1829, Williams was escorted into the new prison with an eyeless hood placed over his head. This was done to secure his anonymity and eventual integration into society upon release, as no one would recognize his face from the prison. But it also served another purpose: to ensure that there would be no chance at escape, as Williams would never see the prison beyond his private cell. Communication with guards was done through a small feeding hole. The inmates lived in complete isolation, with a Bible their only possession, and chores like shoemaking and weaving to occupy their time. Delegates from around the world came to study the famous Pennsylvania System. Alexis de Tocqueville praised the concept, writing about his 1831 trip: "Can there be a combination more powerful for reformation than solitude...leads [a prisoner] through reflection to remorse, through religion to hope; makes him industrious by...idleness?" Others also agreed. More than 300 prisons throughout Europe, South America, Russia, China and Japan would be based on the Eastern State Penitentiary model. But some were not so convinced of the method. Charles Dickens, after his visit in 1842, wrote critically: "I am persuaded that those who designed this system... do not know what it is they are doing... I hold the slow and daily tampering with the mysteries of the brain to be immeasurably worse than any torture of the body." Dickens' doubt would prevail. In 1913, Eastern State gave up on the Pennsylvania System of isolation and penitence. Prisoners shared cells, worked together, and even played in organized sports. Francis Dolan, site manager of the Eastern State Penitentiary Historical Site, explains, "The solitary confinement system was nearly impossible to maintain given the technology of the early 19th century, and collapsed under the weight of its own lofty morals." 
And just like the jail on Walnut Street, the penitentiary, says Dolan, "was doomed by the rapid growth of Philadelphia." What was originally meant to hold about 300 prisoners was, by the 1920s, forced to house some 2,000. More and more cells were constructed, including ones built below ground without windows, light or plumbing. Eventually, solitude wasn't about redemption, but punishment. By the 1960s, Eastern State Penitentiary was falling apart. In 1971 it was officially closed by the state of Pennsylvania. Over the course of its 142 years, the penitentiary held some 75,000 inmates, including the gangster Al Capone. Declared a national historic landmark in 1965, the prison was opened as a historic site in 1994. Today tourists, and not criminals, walk beneath the vaulted ceilings and skylights of the neo-Gothic building that once represented the moral ambitions of America's founding fathers.
9eb116c9cc3660d0a53beb78dbc11658
https://www.smithsonianmag.com/history/edgar-allan-poe-tried-and-failed-to-crack-the-mysterious-murder-case-of-mary-rogers-7493607/?no-ist
Edgar Allan Poe Tried and Failed to Crack the Mysterious Murder Case of Mary Rogers
Edgar Allan Poe Tried and Failed to Crack the Mysterious Murder Case of Mary Rogers She moved amid the bland perfume That breathes of heaven’s balmiest isle; Her eyes had starlight’s azure gloom And a glimpse of heaven–her smile. New York Herald, 1838 John Anderson’s Liberty Street cigar shop was no different from the dozens of other tobacco emporiums frequented by the newspapermen of New York City. The only reason it was so crowded was Mary Rogers. Mary was the teenage daughter of a widowed boarding-house keeper, and her beauty was the stuff of legend. A poem dedicated to her visage appeared in the New York Herald, and during her time clerking at John Anderson’s shop she bestowed her heavenly smile upon writers like James Fenimore Cooper and Washington Irving, who would visit to smoke and flirt during breaks from their offices nearby. In 1838, the cigar girl with “the dainty figure and pretty face” went out and failed to return. Her mother discovered what appeared to be a suicide note; the New York Sun reported that the coroner had examined the letter and concluded the author had a “fixed and unalterable determination to destroy herself.” But a few days later Mary returned home, alive and well. She had been, it turned out, visiting a friend in Brooklyn. The Sun, which three years earlier had been responsible for the Great Moon Hoax, was accused of manufacturing Mary’s disappearance to sell newspapers. Her boss, John Anderson, was suspected of being in on the scheme, for after Mary returned his shop was busier than ever. Still, the affair blew over, and Mary settled back into her role as an object of admiration for New York’s literary set. By 1841 she was engaged to Daniel Payne, a cork-cutter and boarder in her mother’s house. On Sunday, July 25, Mary announced plans to visit relatives in New Jersey and told Payne and her mother she’d be back the next day. The night Mary ventured out, a severe storm hit New York, and when Mary failed to return the next morning, her mother assumed she’d gotten caught in bad weather and delayed her trip home. By Monday night, Mary still hadn’t come back, and her mother was concerned enough to place an advertisement in the following day’s Sun asking anyone who might have seen Mary to have the girl contact her, as “it is supposed some accident has befallen her.” Foul play was not suspected. On July 28, some men were out for a stroll near Sybil’s Cave, a bucolic Hudson riverside spot in Hoboken, New Jersey, when a bobbing figure caught their attention. Rowing out in a small boat, they dragged what turned out to be the body of a young woman back to shore. Crowds gathered, and within hours, a former fiancé of Mary’s identified the body as hers. According to the coroner, her dress and hat were torn and her body looked as though it had taken a beating. She was also, the coroner took care to note, not pregnant, and “had evidently been a person of chastity and correct habits.” Questions abounded: Had Mary been killed by someone she knew? Had she been a victim of a random crime of opportunity, something New Yorkers increasingly worried about as the city grew and young women strayed farther and farther from the family parlor? Why hadn’t the police of New York or Hoboken spotted Mary and her attacker? The Herald, the Sun and the Tribune all put Mary on their front pages, and no detail was too lurid—graphic descriptions of Mary’s body appeared in each paper, along with vivid theories about what her killer or killers might have done to her. 
More than anything, they demanded answers. Suspicion fell immediately upon Daniel Payne, Mary’s fiancé; perhaps one or the other had threatened to leave, and Payne killed her, either to get rid of her or to prevent her from breaking their engagement. He produced an airtight alibi for his whereabouts during Mary’s disappearance, but that didn’t stop the New-Yorker (a publication unrelated to the current magazine of that name) from suggesting, in August of 1841, that he’d had a hand in Mary’s death: There is one point in Mr. Payne’s testimony which is worthy of remark. It seems he had been searching for Miss Rogers—his betrothed—two or three days; yet when he was informed on Wednesday evening that her body had been found at Hoboken, he did not go to see it or inquire into the matter—in fact, it appears that he never went at all, though he had been there inquiring for her before. This is odd, and should be explained. If Payne hadn’t killed Mary, it was theorized, she’d been caught by a gang of criminals. This idea was given further credence later that August, when two Hoboken boys who were out in the woods collecting sassafras for their mother, tavern owner Frederica Loss, happened upon several items of women’s clothing. The Herald reported that “the clothes had all evidently been there at least three or four weeks. They were all mildewed down hard…the grass had grown around and over some of them. The scarf and the petticoat were crumpled up as if in a struggle.” The most suggestive item was a handkerchief embroidered with the initials M.R. The discovery of the clothes catapulted Loss into minor celebrity. She spoke with reporters at length about Mary, whom she claimed to have seen in the company of a tall, dark stranger on the evening of July 25. The two had ordered lemonade and then taken their leave from Loss’ tavern. Later that night, she said, she heard a scream coming from the woods. At the time, she’d thought it was one of her sons, but after going out to investigate and finding her boy safely inside, she’d decided it must have been an animal. In light of the clothing discovery so close to her tavern, though, she now felt certain it had come from Mary. The Herald and other papers took this as evidence that strangers had indeed absconded with Mary, but despite weeks of breathless speculation, no further clues were found and no suspects identified. The city moved on, and Mary’s story became yesterday’s news—only to return to the headlines. In October 1841, Daniel Payne went on a drinking binge that carried him to Hoboken. After spending October 7 going from tavern to tavern to tavern, he entered a pharmacy and bought a vial of laudanum. He stumbled down to where Mary’s body had been brought to shore, collapsed onto a bench, and died, leaving behind a note: “To the World—Here I am on the very spot. May God forgive me for my misspent life.” The consensus was that his heart had been broken. While the newspapers had their way with Mary’s life and death, Edgar Allan Poe turned to fact-based fiction to make sense of the case. Working in the spring of 1842, Poe transported Mary’s tale to Paris and, in “The Mystery of Marie Rogêt,” gave her a slightly more Francophone name (and a job in a perfume shop), but the details otherwise match exactly. 
The opening of Poe’s story makes his intent clear:

The extraordinary details which I am now called upon to make public, will be found to form, as regards sequence of time, the primary branch of a series of scarcely intelligible coincidences, whose secondary or concluding branch will be recognized by all readers in the late murder of MARY CECILIA ROGERS, at New York.

A sequel to “The Murders in the Rue Morgue,” widely considered the first detective story ever set to print, “The Mystery of Marie Rogêt” would see the detective Dupin solve the young woman’s murder. In shopping the story to editors, Poe suggested he’d gone beyond mere storytelling: “Under the pretense of showing how Dupin unraveled the mystery of Marie’s assassination, I, in fact, enter into a very rigorous analysis of the real tragedy in New York.” Though he appropriated the details of Mary’s story, Poe still faced the very real challenge of actually solving the murder when the police were no closer than they’d been in July 1841.

Like many other stories of the mid-19th century, “The Mystery of Marie Rogêt” was serialized, beginning in the November 1842 issue of Snowden’s Ladies’ Companion. The third part, in which Dupin put together the details of the crime but left the identity of the criminal up in the air, was to appear at the end of the month, but a shocking piece of news delayed the final installment.

In October 1842, Frederica Loss was accidentally shot by one of her sons and made a deathbed confession regarding Mary Rogers. The “tall, dark” man she’d seen the girl with in July 1841 had not been a stranger; she knew him. The Tribune reported: “On the Sunday of Miss Rogers’s disappearance she came to her house from this city in company with a young physician, who undertook to produce for her a premature delivery.” (“Premature delivery” being a euphemism for abortion.) The procedure had gone wrong, Loss said, and Mary had died. After disposing of her body in the river, one of Loss’ sons had thrown her clothes in a neighbor’s pond and then, after having second thoughts, scattered them in the woods.

While Loss’ confession did not entirely match the evidence (there was still the matter of Mary’s body, which bore signs of some kind of struggle), the Tribune seemed satisfied: “Thus has this fearful mystery, which has struck fear and terror to so many hearts, been at last explained by circumstances in which no one can fail to perceive a Providential agency.”

To some, the attribution of Mary’s death to a botched abortion made perfect sense—it had been suggested that she and Payne quarreled over an unwanted pregnancy, and in the early 1840s New York City was fervently debating the activities of the abortionist Madame Restell. Several penny presses had linked Rogers to Restell (and suggested that her 1838 disappearance lasted precisely as long as it would take a woman to terminate a pregnancy in secret and return undiscovered), and while that connection was ultimately unsubstantiated, Mary was on the minds of New Yorkers when, in 1845, they officially criminalized the procedure.

Poe’s story was considered a sorry follow-up to “The Murders in the Rue Morgue,” but he did manage to work Loss’ story into his narrative. His Marie Rogêt had indeed kept company with a “swarthy naval officer” who may very well have killed her, though by what means we are not sure—did he murder her outright or lead her into a “fatal accident,” a plan of “concealment”? Officially, the death of Mary Rogers remains unsolved.
Poe’s account remains the most widely read, and his hints at abortion (made even clearer in an 1845 reprinting of the story, though the word “abortion” never appears) have, for most, closed the case. Still, those looking for Poe to put the Mary Rogers case to rest are left to their own devices. In a letter to a friend, Poe wrote: “Nothing was omitted in Marie Rogêt but what I omitted myself—all that is mystification.”

Sources: Poe, Edgar Allan, “The Mystery of Marie Rogêt”; “The Mary Rogers Mystery Explained,” New-York Daily Tribune, Nov. 18, 1842; “The Case of Mary C. Rogers,” The New-Yorker, Aug. 14, 1841; Stashower, Daniel, The Beautiful Cigar Girl (Penguin Books, 2006); Srebnick, Amy Gilman, The Mysterious Death of Mary Rogers: Sex and Culture in Nineteenth-Century New York (Oxford University Press, 1995); Meyers, Jeffrey, Edgar Allan Poe: His Life and Legacy (Cooper Square Press, 1992)

Angela Serratore is a writer and a contributing editor at Smithsonian.com.
620b5ee33b2262c08d276b8049208000
https://www.smithsonianmag.com/history/edward-curtis-epic-project-to-photograph-native-americans-162523282/
Edward Curtis’ Epic Project to Photograph Native Americans
Edward Curtis’ Epic Project to Photograph Native Americans

Year after year, he packed his camera and supplies—everything he’d need for months—and traveled by foot and by horse deep into the Indian territories. At the beginning of the 20th century, Edward S. Curtis worked in the belief that he was in a desperate race against time to document, with film, sound and scholarship, the North American Indian before white expansion and the federal government destroyed what remained of the natives’ way of life. For thirty years, with the backing of men like J. Pierpont Morgan and former president Theodore Roosevelt, but at great expense to his family life and his health, Curtis lived among dozens of native tribes, devoting his life to his calling until he produced a definitive and unparalleled work, The North American Indian. The New York Herald hailed it as “the most ambitious enterprise in publishing since the production of the King James Bible.”

Born in Wisconsin in 1868, Edward Sheriff Curtis took to photography at an early age. By age 17, he was an apprentice at a studio in St. Paul, Minnesota, and his life seemed to be taking a familiar course for a young man with a marketable trade, until the Curtis family packed up and moved west, eventually settling in Seattle. There, Curtis married 18-year-old Clara Phillips, purchased his own camera and a share in a local photography studio, and in 1893, the young couple welcomed a son, Harold—the first of their four children.

The young family lived above the thriving Curtis Studio, which attracted society ladies who wanted their portraits taken by the handsome, athletic young man who made them look both glamorous and sophisticated. And it was in Seattle, in 1895, that Curtis made his first portrait of a Native American—that of Princess Angeline, the eldest daughter of Chief Sealth of the Duwamish tribe. He paid her a dollar for each pose and noted, “This seemed to please her greatly, and with hands and jargon she indicated that she preferred to spend her time having pictures made than in digging clams.”

Yet it was a chance meeting in 1898 that set Curtis on the path away from his studio and his family. He was photographing Mt. Rainier when he came upon a group of prominent scientists who’d become lost; among the group was the anthropologist George Bird Grinnell, an expert on Native American cultures. Curtis quickly befriended him, and the relationship led to the young photographer’s appointment as official photographer for the Harriman Alaska Expedition of 1899, led by the railroad magnate Edward H. Harriman and including the naturalist John Muir and the zoologist C. Hart Merriam. For two months, Curtis accompanied two dozen scientists, photographing everything from glaciers to Eskimo settlements. When Grinnell asked him to come on a visit to the Piegan Blackfeet in Montana the following year, Curtis did not hesitate.

It was in Montana, under Grinnell’s tutelage, that Curtis became deeply moved by what he called the “primitive customs and traditions” of the Piegan people, including the “mystifying” Sun Dance he had witnessed. “It was at the start of my concerted effort to learn about the Plains Indians and to photograph their lives,” Curtis wrote, “and I was intensely affected.” When he returned to Seattle, he mounted popular exhibitions of his Native American work, publishing magazine articles and then lecturing across the country. His photographs became known for their sheer beauty.
President Theodore Roosevelt commissioned Curtis to photograph his daughter’s wedding and to do some Roosevelt family portraits. But Curtis was burning to return to the West and seek out more Native Americans to document. He found a photographer to manage his studio in Seattle, but more important, he found a financial backer with the funds for a project of the scale he had in mind. In 1906 he boldly approached J.P. Morgan, who quickly dismissed him with a note that read, “Mr. Curtis, there are many demands on me for financial assistance. I will be unable to help you.” But Curtis persisted, and Morgan was ultimately awed by the photographer’s work. “Mr. Curtis,” Morgan wrote after seeing his images, “I want to see these photographs in books—the most beautiful set of books ever published.” Morgan agreed to sponsor Curtis, paying out $75,000 over five years in exchange for 25 sets of volumes and 500 original prints. It was enough for Curtis to acquire the necessary equipment and hire interpreters and researchers.

With a trail wagon and assistants traveling ahead to arrange visits, Edward Curtis set out on a journey that would see him photograph the most important Native Americans of the time, including Geronimo, Red Cloud, Medicine Crow and Chief Joseph. The trips were not without peril—impassable roads, disease and mechanical failures; Arctic gales and the stifling heat of the Mohave Desert; encounters with suspicious and “unfriendly warriors.” But Curtis managed to endear himself to the people with whom he stayed. He worked under the premise, he later said, of “We, not you. In other words, I worked with them, not at them.” On wax cylinders, his crew collected more than 10,000 recordings of songs, music and speech among more than 80 tribes, most with their own language. To the amusement of tribal elders, and sometimes for a fee, Curtis was given permission to organize reenactments of battles and traditional ceremonies among the Indians, and he documented them with his hulking 14-inch-by-17-inch view camera, which produced glass-plate negatives that yielded the crisp, detailed and gorgeous gold-tone prints he was noted for. The Native Americans came to trust him and ultimately named him “Shadow Catcher,” but Curtis would later note that, given his grueling travel and work, he should have been known as “The Man Who Never Took Time to Play.”

Just as Curtis began to produce volume after volume of The North American Indian, to high acclaim, J.P. Morgan died unexpectedly in Rome in 1913. J.P. Morgan Jr. contributed to Curtis’s work, but in much smaller sums, and the photographer was forced to abandon his field work for lack of funding. His family life began to suffer—something Curtis tried to rectify on occasion by bringing Clara and their children along on his travels. But when his son Harold nearly died of typhoid in Montana, his wife vowed never to travel with him again. In 1916, she filed for divorce, and in a bitter settlement was awarded the Curtis family home and the studio. Rather than allow his ex-wife to profit from his Native American work, Edward and his daughter Beth made copies of certain glass plate negatives, then destroyed the originals.
While the onset of World War I coincided with a diminishing interest in Native American culture, Curtis scraped together enough funding in an attempt to strike it big with a motion picture, In the Land of the Head-Hunters, for which he paid Kwakiutl men on Vancouver Island to replicate the appearance of their forefathers by shaving off facial hair and donning wigs and fake nose rings. The film had some critical success but flopped financially, and Curtis lost his $75,000 investment. He took work in Hollywood, where his friend Cecil B. DeMille hired him for camerawork on films such as The Ten Commandments. Curtis sold the rights to his movie to the American Museum of Natural History for a mere $1,500 and worked out a deal that allowed him to return to his field work—by relinquishing his copyright on the images for The North American Indian to the Morgan Company.

The tribes Curtis visited in the late 1920s, he was alarmed to find, had been decimated by relocation and assimilation. He found it more difficult than ever to create the kinds of photographs he had in the past, and the public had long ceased caring about Native American culture. When he returned to Seattle, his ex-wife had him arrested for failing to pay alimony and child support, and the stock market crash of 1929 made it nearly impossible for him to sell any of his work. By 1930, Edward Curtis had published, to barely any fanfare, the last of his planned 20-volume set of The North American Indian, after taking more than 40,000 pictures over 30 years. Yet he was ruined, and he suffered a complete mental and physical breakdown, requiring hospitalization in Colorado. The Morgan Company sold 19 complete sets of The North American Indian, along with thousands of prints and copper plates, to Charles Lauriat Books of Boston, Massachusetts, for just $1,000 and a percentage of future royalties. Once Curtis sufficiently recovered his mental health, he tried to write his memoirs, but never saw them published. He died of a heart attack in California in 1952 at the age of 84. A small obituary in the New York Times noted his research “compiling Indian history” under the patronage of J.P. Morgan and closed with the sentence, “Mr. Curtis was also widely known as a photographer.”

The photographs of Edward Curtis represent ideals and imagery designed to create a timeless vision of Native American culture at a time when modern amenities and American expansion had already irrevocably altered the Indian way of life. By the time Curtis had arrived in various tribal territories, the U.S. government had forced Indian children into boarding schools, banned them from speaking in their native tongues, and made them cut their hair. This was not what Curtis chose to document, and he went to great pains to create images of Native Americans posing in traditional clothing they had long since put away, in scenes that were sometimes later retouched by Curtis and his assistants to eliminate any modern artifacts, such as the presence of a clock in his image, In a Piegan Lodge. Some critics have accused him of photographic fakery—of advancing his career by ignoring the plight and torment of his subjects. Others laud him, noting that he was, according to the Bruce Kapson Gallery, which represents Curtis’s work, “able to convey a dignity, universal humanity and majesty that transcend literally all other work ever done on the subject.” It is estimated that producing The North American Indian today would cost more than $35 million.
“When judged by the standards of his time,” Laurie Lawlor wrote in her book, Shadow Catcher: The Life and Work of Edward S. Curtis, “Curtis was far ahead of his contemporaries in sensitivity, tolerance and openness to Native American cultures and ways of thinking. He sought to observe and understand by going directly into the field.”

Sources

Books: Laurie Lawlor, Shadow Catcher: The Life and Work of Edward S. Curtis, Bison Books, 2005. Mick Gidley, Edward S. Curtis and the North American Indian, Incorporated, Cambridge University Press, 2000.

Articles: “Edward Curtis: Pictorialist and Ethnographic Adventurist,” by Gerald Vizenor, essay based on the author’s presentation at an Edward Curtis seminar at the Claremont Graduate University, October 6-7, 2000. http://memory.loc.gov/ammem/award98/ienhtml/essay3.html
“Edward Curtis: Shadow Catcher,” by George Horse Capture, American Masters, April 23, 2001. http://www.pbs.org/wnet/americanmasters/episodes/edward-curtis/shadow-catcher/568/
“The Imperfect Eye of Edward Curtis,” by Pedro Ponce, Humanities, May/June 2000, Volume 21/Number 3. http://www.neh.gov/news/humanities/2000-05/curtis.html
“Frontier Photographer Edward S. Curtis,” A Smithsonian Institution Libraries Exhibition. http://www.sil.si.edu/Exhibitions/Curtis/index.htm
“Selling the North American Indian: The Work of Edward Curtis,” created by Valerie Daniels, June 2002. http://xroads.virginia.edu/~ma02/daniels/curtis/promoting.html
“Edward S. Curtis and The North American Indian: A detailed chronological biography,” Eric J. Keller/Soulcatcher Studio. http://www.soulcatcherstudio.com/artists/curtis_cron.html
“Edward S. Curtis (1868-1952) and The North American Indian,” by Mick Gidley, essay from The Vanishing Race: Selections from Edward S. Curtis’ The North American Indian (Newton Abbot: David and Charles, 1976; New York: Taplinger, 1977). http://memory.loc.gov/ammem/award98/ienhtml/essay1.html

Gilbert King is a contributing writer in history for Smithsonian.com. His book Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America won the Pulitzer Prize in 2013.
a53aa8a3729b96af078e2023507be83f
https://www.smithsonianmag.com/history/electronic-cigarettes-millennial-appeal-ushers-next-generation-nicotine-addicts-180968747/
Ads for E-Cigarettes Today Hearken Back to the Banned Tricks of Big Tobacco
Ads for E-Cigarettes Today Hearken Back to the Banned Tricks of Big Tobacco

Robert Jackler has spent more than a decade researching the history of advertisements produced by Big Tobacco. The ear, nose and throat surgeon from Stanford University has built a diverse collection of more than 50,000 advertisements from magazines, newspapers, billboards, television and the internet from the turn of the 20th century to the present day. The collection, archived within the Smithsonian’s National Museum of American History, highlights the tobacco industry’s efforts to deceive the public about the health risks of their products. Altogether, the ads reveal the depths to which cigarette companies would stoop to entice Americans to smoke. Central themes of Jackler’s collection include faux medical imagery and exaggerated health claims, posh cultural icons and celebrity endorsements, and the explicit targeting of youth populations with cartoon characters, sweet flavors and promises of elevated social status. Now, decades after the U.S. government began regulating tobacco advertising, these techniques are being diverted toward getting young people to try e-cigarettes.

Throughout the late 20th century, the federal government tightened regulations on the placement and content of traditional tobacco advertisements, largely limiting their exposure to children. The first of these regulations came when Congress passed the Public Health Cigarette Smoking Act of 1970 to ban the advertising of cigarettes on television and radio, following the landmark 1964 Surgeon General report that causally linked lung cancer and chronic bronchitis to smoking. Other efforts from Big Tobacco to target children were eventually stymied by the government as well. In the late '90s, the Federal Trade Commission banned the indelible Joe Camel, and the Food and Drug Administration banned kid-friendly flavors like strawberry, grape and chocolate from traditional, or combustible, cigarettes in 2009.

But in the early 2000s, emerging companies promoted a new way of getting hooked on nicotine: electronic cigarettes, more commonly known as e-cigarettes. Jackler has close to 13,000 items in his collection pertaining to this recent fad, and his research has revealed troubling similarities between the campaigns of old and the practices used today. “[E-cigarette producers] ignore absolutely everything that was ever agreed to around combustible cigarettes,” says Jackler. His collection of e-cigarette ads is rife with misleading, targeted messages that hawk everything from pseudoscientific health claims to kid-friendly bubblegum flavors and “back to school” sales. “You have pictures of doctors saying, ‘Use this e-cigarette.’ You have all sorts of claims in e-cigarettes that are the kinds of things that would have been forbidden. E-cigarettes show up on television and radio,” he continued. Calling the industry an “unregulated Wild West,” Jackler bemoans the familiarity of the techniques he sees in the marketplace.

Take the San Francisco e-cigarette startup JUUL, to name one, which advertises “delicious” flavors that promise to “deliver a vapor experience like no other,” all in the service of a lofty mission of helping adults to quit smoking. E-cigarettes’ inroads in disrupting the traditional tobacco industry seemingly would be good news to anti-smoking campaigners, and startups like JUUL capitalize on this perception.
They proclaim on their website that they are “driving innovation to eliminate cigarettes.” However, Jackler and others argue that e-cigarette companies’ marketing campaigns carry much more appeal to adolescents, most of whom may never have considered smoking traditional cigarettes and haven’t been subjected to heavy cigarette marketing thanks to new regulations. With bright colors, sleek design and fashionable millennial models, advertisements for JUUL’s high-nicotine product could easily be promoting the newest smartphone line. “Very clearly, they do the same damn thing today as they did then. The messaging is very subtle, very carefully crafted. They target, in the same way, adolescents,” says Jackler.

(UPDATE, 4/13/18: JUUL submitted the following statement via e-mail: "It is absolutely false that Juul markets to anyone other than adult smokers. We could not be more emphatic on this point: Our product is only intended for adult smokers. No young person, and no adult who is already not a smoker, should use our product or any nicotine product. All of our marketing reflects that position.")

Conventional cigarette use among teens has been on a steady decline for the past 20 years, dipping below six percent for high school seniors in 2015. Now, e-cigarettes have begun to take their place. According to a 2016 report by the United States Surgeon General, 16 percent of high school students had used e-cigarettes in the past 30 days and 40 percent had tried them at least once. While e-cigarettes are advertised as a less-harmful alternative to their combustible counterparts, studies have linked e-cigarettes to a host of health issues, including asthma and chronic bronchitis. Perhaps more troubling, they also can act as a gateway to traditional smoking: studies find adolescents who use e-cigarettes are 20 percent more likely to take up traditional cigarettes in the future.

At least one fMRI brain imaging study supports Jackler’s claims that e-cigarette ads appeal particularly to kids. Research by Yvonnes Chen at the University of Kansas suggests that some e-cigarette advertisements may trigger high levels of activation in the reward centers of adolescent brains, even for those who had never smoked. Themes of rebelliousness, sex appeal and kid-friendly flavors abounded in the e-cigarette ads used in the study, which Chen says likely explains the adolescents’ heightened neural and behavioral responses. “If you take a look at these categories, these have been traditionally used by tobacco companies when they were trying to market the combustible tobacco products,” Chen says. “The appeals are very consistent throughout the decades… and clearly, these are the traits that are traditionally appealing to adolescents and even children.”

Many studies have shown that, for adult smokers, merely watching videos featuring tobacco products activates the reward center in the brain in the same way that physically smoking a cigarette would. It’s a pernicious effect that tends to result in more intense cravings for cigarettes, thus reinforcing the vicious circle of nicotine addiction. But it’s an effect that, until recently, had been demonstrated only in routine smokers. For their recent study, Chen’s team looked at the same reward centers in a group of 30 nonsmoking 14- to 21-year-old participants while they were shown e-cigarette advertisements in an fMRI machine (as compared with control ads).
In addition to the neural activity, the adolescents expressed a stronger desire to use e-cigarettes than the control products after exposure to the advertisements, suggesting that youth appeal may be even stronger than researchers had expected. “These ads are designed to appeal to users who are not of age… and we know based on animal models that adolescents’ brains are already so much more susceptible to nicotine,” she said. “There definitely are a lot of consequences socially with that. E-cigarette companies have a bigger role to play in terms of being more responsible citizens.”

These advertisements are not only especially appealing to young people, but they’re quickly becoming ubiquitous among them. The Centers for Disease Control recently reported that 69 percent of middle and high school students had seen e-cigarette ads, with most of this exposure occurring in retail stores and on the internet.

According to Jackler, far from being a win for anti-tobacco groups, the e-cigarette industry’s youth appeal and outpacing of regulatory activity could be a win-win for Big Tobacco. Citing Lorillard Tobacco Company’s recent acquisition of the e-cigarette company Blu, he foresees a marketplace where the major tobacco companies swallow up their more youthful competitors. The e-cigarette industry’s adolescent customer base might then be Big Tobacco’s next generation of combustible cigarette smokers. “Smoking initiation is an adolescent thing… they smoke, they get hooked on the nicotine, and they become lifelong consumers,” says Jackler.

Editor's note, April 13, 2018: This story has been updated to include a statement from JUUL.
50dabc1b366da7f34f755ca28f5b6a8f
https://www.smithsonianmag.com/history/emperor-wang-mang-chinas-first-socialist-2402977/?no-ist
Emperor Wang Mang: China’s First Socialist?
Emperor Wang Mang: China’s First Socialist?

October 7, 23 A.D. The imperial Chinese army, 420,000 strong, has been utterly defeated. Nine “Tiger Generals,” sent to lead a corps of 10,000 elite soldiers, have been swept aside as rebel forces close in. The last available troops—convicts released from the local jails—have fled. Three days ago, rebels breached the defenses of China’s great capital, Chang’an; now, after some bloody fighting, they are scaling the walls of the emperor’s private compound. Deep within his Endless Palace, Emperor Wang Mang waits for death.

For 20 years, ever since he first contemplated the overthrow of the dissolute remnants of the Han Dynasty, the usurper Wang had driven himself to keep to an inhuman schedule, working through the night and sleeping at his desk as he labored to transform China. When the rebellion against him gained strength, however, Wang appeared to give up. He retreated to his palace and summoned magicians with whom he passed his time testing spells; he began to assign strange, mystical titles to his army commanders: “The Colonel Holding a Great Axe to Chop Down Withered Wood” was one.

Such excesses seemed out of character for Wang, a Confucian scholar and renowned ascetic. The numismatist Rob Tye, who has made a study of the emperor’s reign, believes that he succumbed to despair. “Frankly, my own assessment is that he was high on drugs for most of the period,” Tye writes. “Knowing all was lost, he chose to escape reality, seeking a few last weeks of pleasure.”

When the rebels broke into his palace, Wang was in the imperial harem, surrounded by his three Harmonious Ladies, nine official wives, 27 handpicked “beauties” and their 81 attendants. He had dyed his white hair in order to look calm and youthful. Desperate officials persuaded him to retire with them to a high tower surrounded by water in the center of the capital. There, a thousand loyalists made a last stand before the armies of the revived Han, retreating step by step up twisting stairs until the emperor was cornered on the highest floor. Wang was slain late in the afternoon, his head severed, his body torn to pieces by soldiers seeking mementos, his tongue cut out and eaten by an enemy.

Did he wonder, as he died, how it had come to this—how his attempts at reform had inflamed a whole nation? And did it strike him as ironic that the peasants he had tried to help—with a program so seemingly radical that some scholars describe it as socialist, even “communistic”—had been the first to turn against him?

Wang Mang may be the most controversial of China’s hundred or more emperors. Born into one of his country’s oldest noble families in about 45 B.C., he was celebrated first as a scholar, then as an ascetic and finally as regent for a succession of young and short-lived emperors. Finally, in 9 A.D., with the death (many believe the murder) of the last of these infant rulers, Wang seized the throne for himself. His usurpation marked the end of the Former Han Dynasty, which had reigned since 206 B.C., shortly after the death of China’s renowned First Emperor, builder of the Great Wall and the celebrated Terracotta Army. In the Han’s place, Wang proclaimed the Xin—“new”—dynasty, of which he was destined to remain the solitary emperor.

The 14 years of Wang Mang’s reign can be divided into two parts: eight years of dramatic reform followed by six of escalating rebellion.
The first period witnessed attempts to overhaul the entire system of imperial government, though whether the emperor intended to return China to the days of the semi-legendary Zhou Dynasty, which had ruled China before the Han, or introduce radical new policies of his own, remains hotly disputed. The second period witnessed the upheaval known as the Red Eyebrow Rebellion (an attempt by desperate and essentially conservative peasants to reverse some of Wang’s riskier reforms), the resurgence of the Han and the deaths of an estimated 25 million people—perhaps half the total Chinese population at that time.

Any attempt to assess Wang’s reign is beset with difficulties. Usurpers rarely enjoy a good press, but China has always treated its rebel rulers rather differently. In imperial times, it was believed that all emperors ruled thanks to the “mandate of heaven,” and hence were themselves the Sons of Heaven, practically divine. It was, however, perfectly possible to lose this mandate. Portents such as comets and natural disasters could be interpreted as heaven’s warning to a ruler to mend his ways; any emperor who subsequently lost his throne in an uprising was understood to have forfeited heaven’s approval. At that point, he became illegitimate and his successor, no matter how humble his origins, assumed the mantle of Son of Heaven. From the point of view of Chinese historiography, however, emperors who lost their thrones had never been legitimate to begin with, and their histories would be written with a view to demonstrating just how lacking in the necessary virtues they had always been.

Wang Mang provoked a devastating civil war that ended with a large proportion of his empire in arms against him. Because of this, the historian Clyde Sargent stresses, he “traditionally has been considered as one of the greatest tyrants and despots in Chinese history.” No line of the official account of his reign views his policies as justified or positive. Even its description of his features reflects bias; as Hans Bielenstein observes, Wang “is described as having a large mouth and a receding chin, bulging eyes with brilliant pupils, and a loud voice which was hoarse.”

More recently, however, Wang Mang has undergone a startling reappraisal. This process can be dated to 1928 and the publication of a study by Hu Shih, the renowned scholar who later served as Chinese ambassador to the United States. In Hu’s view, it was the Han Dynasty that most richly deserved condemnation, for having produced “a long line of degenerate scions.” Wang Mang, on the other hand, lived simply, thought deeply and was “the first man to win the empire without an armed revolution.” Moreover, Wang then nationalized his empire’s land, distributed it equally to his subjects, cut land taxes from 50 percent to 10, and was, all in all, “frankly communistic”—a remark Hu intended as a compliment.

Hu Shih’s portrayal of Wang Mang has been hotly disputed since he wrote it, and understanding what the emperor really thought, or intended, during his reign is rendered all but impossible by the scarcity of sources. With the exception of a few coins and a handful of archaeological remains, all that is known of Wang is contained in his official biography, which appears as Chapter 99 of the History of the Han Dynasty, compiled shortly before 100 A.D. This is quite a lengthy document—the longest of all the imperial biographies that survive from this period—but by its very nature it is implacably opposed to the usurper-emperor.
To make matters worse, while the History says a good deal about what Wang did, it tells us very little about why he did it. In particular, it displays no real interest in his economic policies.

The little that is known about Wang Mang’s reforms can be summarized as follows. It is said he invented an early form of social security payments, collecting taxes from the wealthy to make loans to the traditionally uncreditworthy poor. He certainly introduced the “six controls”—government monopolies on key products such as iron and salt that Hu Shih saw as a form of “state socialism”—and was responsible for a policy known as the Five Equalizations, an elaborate attempt to damp down fluctuations in prices. Even Wang’s harshest modern critics agree that his ban on the sale of cultivated land was an attempt to save desperate farmers from the temptation to sell up during times of famine; instead, his state provided disaster relief. Later the emperor imposed a ruinous tax upon slave owners. It is equally possible to interpret this tax either as an attempt to make slaveholding impossible or as a naked grab for money.

Of all Wang Mang’s policies, however, two stand out: his land reforms and the changes he made to China’s money. As early as 6 A.D., when he was still merely regent for an infant named Liu Ying, Wang ordered the withdrawal of the empire’s gold-based coins and their replacement with four bronze denominations of purely nominal value—round coins with values of one and 50 cash and larger, knife-shaped coins worth 500 and 5,000 cash. Since Wang’s 50-cash coins had only 1/20th the bronze per cash of his smallest coins, and his 5,000-cash coins were minted with proportionally even less, the effect was to substitute fiduciary currency for a Han dynasty gold standard. Simultaneously, Wang ordered the recall of all the gold in the empire. Thousands of tons of the precious metal were seized and stored in the imperial treasury, and the dramatic decrease in its availability was felt as far away as Rome, where the Emperor Augustus was forced to ban the purchase of expensive imported silks with what had become—mysteriously, from the Roman point of view—irreplaceable gold coins. In China, the new bronze coinage produced rampant inflation and a sharp increase in counterfeiting.

Wang Mang’s land reforms, meanwhile, appear even more consciously revolutionary. “The strong,” Wang wrote, “possess lands by the thousands of mu, while the weak have nowhere to place a needle.” His solution was to nationalize all land, confiscating the estates of all those who possessed more than 100 acres, and to distribute it to those who actually farmed it. Under this, the so-called ching system, each family received about five acres and paid the state tax in the form of 10 percent of all the food they grew.

Historians are divided as to Wang Mang’s intentions. Several, led by Bielenstein, suggest that catastrophic changes in the course of the Yellow River took place during his regency period, resulting in famine, drought and flood; if this is true, it can certainly be argued that Wang spent his entire reign battling forces that he could not possibly control. But the majority of modern accounts of Wang’s reign see him as a Confucian, not a communist. Bielenstein, in his contribution to the imposing Cambridge History of China, takes this view, though he chooses to ignore some of the more contentious issues.
And while Clyde Sargent (who translated the History of the Han Dynasty) acknowledges the “startling modernity” of the emperor’s ideas, he adds that there is insufficient evidence to prove he was a revolutionary. For Oxford University’s Homer Dubs, author of the standard account of Wang’s economic policies, the emperor’s new coins were issued in conscious imitation of an ancient tradition, dating to the Warring States period, of circulating two denominations of bronze coins. Indeed, the emperor’s monetary policy, Dubs writes, can be viewed as a purely “Confucian practice, since a cardinal Confucian principle was the imitation of the ancient sages”; he also points out that the loans the emperor made available to “needy persons” came with a high interest rate, 3 percent per month. Moreover, few of the emperor’s most apparently socialist policies remained in force in the face of widespread protest and rebellion. “In the abolition of slavery and the restriction of land holdings,” Dubs writes, “Wang Mang undoubtedly hit upon a measure that would have benefited society, but these reforms were rescinded within two years.”

For Dubs, the usurper’s policies have mundane origins. None, he argues, was truly revolutionary, or even original to Wang. Even the celebrated land reforms were the product of a Confucian tradition, “said to have been universal in Zhou times”—and were little more than “the dream of idealistic scholars,” since the five-acre parcels handed out to peasant families were too small to make practical farms. (According to the contemporary imperial historian Ban Gu, 10 or 15 acres was the minimum needed to support a family.)

Others argue that the emperor really did have radical ideas. Tye joins Hu Shih in preferring this interpretation, commenting on the “astonishing breadth” of Wang Mang’s program, from “a national bank offering fair rates of interest to all” and a merit-based pay structure for bureaucrats to “strikingly pragmatic” taxes—among them what amounted to the world’s first income tax. For Tye, the monetary expert, Wang’s fiscal reforms were intended to impoverish wealthy nobles and merchants, who were the only people in the empire to possess substantial quantities of gold. His bronze coins, in this interpretation, released the less-privileged (who owed money) from the curse of debt, while having practically no effect on a peasantry who lived by barter.

Wang’s view of the economic chaos he created is similarly open to interpretation. We know that, even at the height of the rebellion against him, the emperor refused to release precious metal from his treasury, and that after he was overthrown, the imperial vaults were found to contain 333,000 pounds of gold. For Dubs, this refusal suggests merely that Wang Mang was “miserly.” For Hu Shih, Wang remained noble to the last, refusing to reverse his policies in a clearly doomed attempt to save his government.

The last word may be left to the emperor himself. Writing with Confucian modesty in the years before his rise to power, Wang observed:

When I meet with other nobles to discuss things face-to-face, I am awkward and embarrassed. By nature I am stupid and vulgar, but I have a sincere knowledge of myself. My virtue is slight, but my position is honorable. My ability is feeble, but my responsibilities are great.

Sources: Mary Anderson. Hidden Power: The Palace Eunuchs of Imperial China. Amherst: Prometheus Books, 1990; Hans Bielenstein.
“Wang Mang, the restoration of the Han dynasty, and Later Han” in The Cambridge History of China vol. 1. Cambridge: Cambridge University Press, 1987; Hans Bielenstein. “Pan Ku’s accusations against Wang Mang” in Charles Le Blanc & Susan Blader (eds.), Chinese Ideas About Nature and Society: Essays in Honour of Derk Bodde. Hong Kong: Hong Kong University Press, 1987; Homer Dubs. “Wang Mang and his economic reforms.” In T’oung Pao, 2nd series, 35 (1944); Hu Shih. “Wang Mang, the socialist emperor of nineteen centuries ago.” In Journal of the North-China Branch of the Royal Asiatic Society LIX (1928); Michael Loewe. “Wang Mang and his forebears: the making of the myth.” In T’oung Pao, 2nd series, 80 (1994); Clyde Bailey Sargent. Wang Mang: A Translation of the Official Account of His Rise to Power as Given in the “History of the Former Han Dynasty”. Shanghai: Graphic Art Book Co., 1947; Rob Tye. “Wang Mang,” Early World Coins, accessed November 12, 2011.

Mike Dash is a contributing writer in history for Smithsonian.com. Before Smithsonian.com, Dash authored the award-winning blog A Blast From the Past.
9de6e9282756e0ee1cb85a0f8f236812
https://www.smithsonianmag.com/history/enrico-fermi-really-father-nuclear-age-180967214/
Was Enrico Fermi Really the “Father of the Nuclear Age”?
Was Enrico Fermi Really the “Father of the Nuclear Age”?

Just over 75 years ago, on December 2, 1942, physicist Enrico Fermi conducted a famous nuclear experiment beneath the University of Chicago’s football field. The experiment proved that nuclear chain reactions occur and could be used to release the energy of the uranium atom in a sustained way. It also cleared the way for the production of plutonium. A new book by David N. Schwartz, The Last Man Who Knew Everything: The Life and Times of Enrico Fermi, Father of the Nuclear Age, examines the scientist whose breakthrough 75 years ago this month changed the world.

As the son of Nobel Prize-winning physicist Melvin Schwartz, what made you want to write a biography of Fermi?

He was always a topic of conversation in my household. In 2013, my mom sent me a batch of papers from my father’s filing cabinet and one of them was an essay that a buddy of his had written about Fermi’s years in Chicago. Oh, my heavens! What an amazing character. I said, “I’m going to go and pick up a biography of him.” I checked the library, and the last biography of Enrico Fermi had been published in 1970. The world of physics really owes a huge amount to Fermi in a lot of different ways. So, I said, “Well, why not try writing a new biography that takes all of that into account.”

What research did you do for the book?

My wife and I spent a month in Italy in the fall of 2015 going through the University of Rome archives where Fermi taught for many years. We interviewed six or seven of his living students and colleagues – remarkable people who had amazing memories of their interactions with Fermi. We also went to the National Archives in College Park, Maryland, where I dug through a lot of material. I looked at his FBI file and his security background clearance files.

Based on new archival material and exclusive interviews, The Last Man Who Knew Everything lays bare the enigmatic life of a colossus of twentieth-century physics.

Why did he go on to work on nuclear weapons?

When the news came from Germany in January 1939 that the uranium atom had been split, physicists began to worry that a bomb could be made out of this. Then, at the end of the summer of 1939, the German physicist Werner Heisenberg came to visit. Fermi tried to persuade him to defect to the United States because, he said, “If you go back to Germany, you’ll be required to work on a nuclear weapon for the Nazis and that would be terrible.” Heisenberg said, “I owe my patriotic duty to my country. I’m not going to defect to the United States.” That really shook Fermi up, and he decided to move ahead, because if the Germans beat the Americans to this, it would be an absolute disaster.

Fermi was in Los Alamos when he heard that the U.S. had dropped atomic bombs on Hiroshima and Nagasaki. How did he react to this news?

There’s nothing recorded. His wife’s reaction was probably not surprising. She was pleased that the war was over, proud of the role that her husband played, but also very saddened by the destruction and the threat that this kind of weapon would pose for future generations.

What impact did his role in nuclear weapons have on his inner life?

He never spoke about it. Never wrote about it. We don’t know what he thought about it. But after 1951, he never again worked for the government.

Do you think him being known as “the father of the nuclear age” is apropos given his contributions?

If you think the nuclear age began with the first sustained chain reaction, then he is the father of the nuclear age.
There’s no question about that. Is he the father of nuclear weapons? I think there are a lot of people who bear responsibility for that. J. Robert Oppenheimer, certainly, and Arthur Compton. And Ernest Lawrence’s contribution to the Manhattan Project [the U.S. government research project that produced the first atomic bombs] is immense. Lawrence invented the main processes for uranium enrichment. The project just simply wouldn’t have happened without Oppenheimer. The nuclear age is a broader concept than just simply the nuclear bomb. The nuclear age is, in my view, the moment when man was able to master the process of releasing energy from the nucleus of the atom. Fermi was certainly the father of that.

Why do you say Fermi was “the last man who knew everything”?

He contributed to virtually every field of physics, from quantum physics to particle physics, from condensed matter physics to astrophysics. He even did geophysics! Because physics has since become so specialized, he was really the last man who could see all of physics as an integrated whole.

What was he like?

Fermi had an incredibly sunny personality and a great sense of humor. People who knew him fell in love with him. After he died, colleagues created an audio record called “To Fermi With Love.” You just don’t see that with other scientists.

This article is a selection from the December issue of Smithsonian magazine.

Tacuma Roeback, a graduate student at Northwestern University, is an editorial intern with Smithsonian magazine.
9f2720e0c992a716c613cc72e2dc2c56
https://www.smithsonianmag.com/history/enslaved-girl-americas-first-poster-child-180971444/
The Enslaved Girl Who Became America’s First Poster Child
The Enslaved Girl Who Became America’s First Poster Child

On February 19, 1855, Charles Sumner, the Massachusetts senator, wrote his supporters about an enslaved 7-year-old girl whose freedom he had helped to secure. She would be joining him onstage at an abolitionist lecture that spring. “I think her presence among us (in Boston) will be a great deal more effective than any speech I could make,” the noted orator wrote. He said her name was Mary, but he also referred to her, significantly, as “another Ida May.”

Sumner enclosed a daguerreotype of Mary standing next to a small table with a notebook at her elbow. She is neatly outfitted in a plaid dress, with a solemn expression on her face, and looks for all the world like a white girl from a well-to-do family. When the Boston Telegraph published Sumner’s letter, it caused a sensation. Newspapers from Maine to Washington, D.C., picked up on the story of the “white slave from Virginia,” and paper copies of the daguerreotype were sold alongside a broadsheet promising the “History of Ida May.”

The name referred to the title character of Ida May: A Story of Things Actual and Possible, a thrilling novel, published just three months earlier, about a white girl who was kidnapped on her fifth birthday, beaten unconscious and sold across state lines into slavery. The author, Mary Hayden Green Pike, was an abolitionist, and her tale was calculated to arouse white Northerners to oppose slavery and to resist the Fugitive Slave Act, the five-year-old federal law demanding that suspected slaves be returned to their masters. Pike’s story fanned fears that the law threatened both black and white children, who, once enslaved, might be difficult to legally recover.

It was shrewd of Sumner to link the outrage stirred by the fictional Ida May to the plight of the real Mary—a brilliant piece of propaganda that turned Mary into America’s first poster child. But Mary hadn’t been kidnapped; she was born into slavery.

I first learned of Mary in 2006, the same way the residents of Boston met her in 1855: by reading Sumner’s letter. That chance encounter led me on a 12-year-long quest to discover the truth about this child who had been lost to history, a forgotten symbol of the nation’s struggle against slavery. Now the true story of Mary Mildred Williams can be told in detail for the first time.

In the reading room of the Massachusetts Historical Society, I held Mary’s daguerreotype, labeled “Unidentified Girl, 1855.” She would still be missing but for a handwritten note offering a clue to her identity: “slave child in which Governor Andrew was interested.” I went on to find the story of Mary and her family in thousands of documents spread across 115 years, beginning in the court filings and depositions of the Cornwells, the Virginia family who had owned Mary’s grandmother, Prudence Nelson Bell, since 1809.

Prudence and her children were all so light as to “be taken to be white,” the courts stated. Their skin color was evidence of a then-common act: nonconsensual sex between an enslaved woman and a white member of the master class. Mary’s mother was Elizabeth, Prudence’s daughter with her mistress’s neighbor, Capt. Thomas Nelson. Mary’s father was Seth Botts, an enslaved man who was the son of his master. Elizabeth and Seth were married in the early 1840s. Mary, their second child, was born in 1847. In 1850, Mary’s father escaped to Boston via the Underground Railroad, changing his name along the way to Henry Williams to match his forged free papers.
Through his remarkable charisma, Williams raised enough funds to buy the freedom of his children, his wife, her mother and four of Mary’s aunts and uncles. Abolitionist John Albion Andrew—the future governor of Massachusetts—was Williams’ lawyer, and he contacted Sumner to handle the funds needed to redeem Mary and her family from Virginia. Once freed, they traveled to Washington, where they met the senator.

Sumner said the oldest Williams child, Oscar, was “bright and intelligent, [with] the eyes of an eagle and a beautiful smile.” But Sumner chose to photograph Mary and introduce her to journalists and Massachusetts legislators. Oscar was dark, like his father, while Mary was light, like her mother. Mary’s whiteness made her compelling to white audiences.

Throughout the spring of 1855, Mary made headlines in Washington, New York and Massachusetts. In March, she sat onstage at Boston’s Tremont Temple as Sumner lectured to a crowd of thousands. And at least twice she appeared with Solomon Northup, a free-born black man who had, in fact, been kidnapped and enslaved; he had told his story in his memoir Twelve Years a Slave.

“Little Ida May” faded from view after the Civil War, but I was able to piece together the basic facts of her life. She never married and did not have children. She resided mostly in Boston, near her family, working as a clerk in the registry of deeds and living as a white woman—a decision criminalized in the Jim Crow era as “passing.” The Rev. Thomas Wentworth Higginson, an abolitionist who knew her, said he “willingly lost sight of her” so she could “disappear...in the white ranks.” Mary moved to New York City in the early years of the 20th century; she died in 1921, and her body was returned to Boston and buried with her family in an integrated cemetery.

I never found a single letter or document written by Mary herself, and no contemporary quotation of hers survives. Her own voice remains unheard.

In March 1855, young Mary was taken to the offices of the New-York Daily Times, where reporters looked her over and expressed “astonishment” that this child was “held a slave.” Today, people are similarly surprised when I show them the daguerreotype of Mary and point out she was born into slavery. They react the same as people did a century and a half ago, revealing that they still harbor some of the assumptions about race and slavery that Sumner tapped into when he first put Mary onstage.

This article is a selection from the March issue of Smithsonian magazine.

Jessie Morgan-Owens is the Dean of Bard Early College in New Orleans and the author of Girl in Black and White: The Story of Mary Mildred Williams and the Abolition Movement.
d0162e4a257ee125ac23c5c6fe3c1c6f
https://www.smithsonianmag.com/history/enumerated-story-census-180974648/
The Enumerated Story of the Census
The Enumerated Story of the Census

On January 21 of this year, Lizzie Chimiugak Nenguryarr, a 90-year-old elder in Toksook Bay, Alaska, became the first person to be counted in the 2020 Census. Workers from the U.S. Census Bureau and the agency’s director traveled to a rural corner of the state to kick off the decennial survey that helps apportion funds and representation. But just two months later, the Bureau paused all ground operations amid the outbreak of COVID-19—and has urged households to respond to the census online, by mail or by phone. The Bureau doesn’t know yet how many people will respond to the survey or how successful the push to make the census primarily digital will be, but the public can rest assured that the Census will go on, as it always has under Constitutional mandate.

That is what drew Andrew Whitby, a data scientist and author of the new book The Sum of the People: How the Census Has Shaped Nations, from the Ancient World to the Modern Age, to studying and writing about the census. “The core idea, that the government representing us reaches out to every single household in the nation and asks some basic questions, feels very democratic to me,” he told Smithsonian. “There’s nothing that really matches it: not everyone votes, and not every household submits an income tax return. It’s really the one time each decade that we pay attention to every single person living in this country.”

Whitby spoke with the magazine about early instances of record-keeping, how the U.S. survey has evolved and what the future might hold for the census.

This three-thousand-year history of the census traces the making of the modern survey and explores its political power in the age of big data and surveillance.

What’s the history of the census? What is the first recorded census and why did it come about?

It’s hard to pinpoint a “first” census. The idea of formally counting people probably arose as soon as we started living together in communities large enough to require formal government and taxation—around, say, 5,000 years ago. Various mythological traditions describe censuses not long after that—one conducted by China’s Emperor Yu, or by Moses in the Old Testament—but we can’t take that as history. The word “census” is Latin in origin, so in a linguistic sense, the first censuses were taken by the Romans, who were certainly doing this by around the middle of the first millennium B.C. But few if any of those counts would meet today’s definition, which is essentially to count everyone in a given place at a given time. The biblical censuses, for example, excluded women—as did the Romans, as far as we know. There were, no doubt, small counts of villages or cities through history that would qualify, but today we mostly think about the census at a national level. So if you forced me to choose a first modern census, I might pick that of Iceland in 1703, which recorded exactly 50,366 people and was about as accurate as a census today. That’s nearly a century before the first U.S. census in 1790.

A lot of countries have censuses. What’s unique about the decennial census in the United States? Why was it seen as so essential during the nation’s founding?

First, the U.S. census is embedded in the Constitution itself—Article 1, Section 2—on a ten-year cycle. Whereas censuses in other countries might occur (or not) at the whim of a particular government, the U.S. census would go ahead every ten years no matter what (and since then, it has).
Secondly, it was tied to a mechanistic, arithmetic approach to ensuring representative government, reapportioning Congress after each count. That was a novel idea. At the time the first U.S. census took place in 1790, Britain’s Parliament suffered acutely from so-called “rotten boroughs”—unrepresentative districts with only a handful of voters. The U.S. had its own flaws, most notably the three-fifths compromise (which treated an enslaved person as three-fifths of a free person, for apportionment), but [the census] was sound.

Last year, we saw much debate about the inclusion of a citizenship question on the 2020 census, which was eventually struck down by the Supreme Court. How has the census been politicized before?

Statisticians would love for the American census to be an objective, scientific instrument, but that very innovation, to use it for apportionment, prevents it from ever being so. Censuses have consequences for the distribution of power, so censuses attract political scrutiny. After the 1920 census—the first in which a majority of the population was urban—Congress failed its constitutional duty to reapportion itself, as rural states opposed the loss of power that would have entailed. It wasn’t until 1929 that a law was passed making apportionment automatic, as it is today. With the civil rights era, other more egregious wrongs in the American electoral system were gradually rectified, and so even more attention focused on the census itself. Since about the 1970s, seemingly arcane questions of methodology have often been settled in the courts, as with the now-rejected citizenship question.

How has Americans’ changing concept of privacy affected the census?

In the early days, some people saw the census as an intrusion, an illiberal imposition, but they didn’t really have the language of privacy to describe that. The first inklings of privacy arose in respect of commercial statistics the census began to collect in 1820; businesspeople were worried that their competitors might learn their secrets. It was really in the late 19th and early 20th century that our current notion of privacy crystallized. The census—which was asking ever-more personal questions—responded by adopting a commitment to secrecy. As President Taft declared in 1910, no person can “be harmed in any way by furnishing the information required.” There have been lapses since, especially during times of war, but that is still the guiding principle.

I learned from your book that we have a census worker to thank for early computing machines. How has innovation changed the census?

The logistics of reaching everybody and collecting their responses was always a challenge—and it continues to be. But for a long time, the compilation and analysis of responses was difficult, too. Thousands of clerks were employed to copy and condense this information, just to make a single table in a census publication. It’s hard to imagine anything else in the 19th century that compares to it: the census was the original Big Data. But as the number of questions and the population grew, this became increasingly difficult. After the 1880 U.S. census, a real effort was made to find a better way of doing things. One bright young former census agent, Herman Hollerith, invented a machine to tabulate results by first encoding them on punch cards and then feeding those cards through an electric tabulating machine.
It wasn't a computer, but the technology of punch cards underpinned computing up into the 1970s, and the company Hollerith founded eventually became IBM. I was surprised when filling out the census recently that it only included 9 questions. Previous iterations had dozens and dozens of questions. What accounts for that change? Where are governments getting the other information? First, well done on responding! It’s interesting that you were surprised—and you're not the only one who has said that to me. The core census—the set of questions asked of every household—has been very short since about 1970, never asking more than about ten questions. That was an intentional change from the earlier period. The Census Bureau became increasingly aware that it was missing people, and it is a general rule in survey statistics that the shorter the survey, the more likely people will respond. So the Bureau really streamlined things. What you might be remembering is that up to 2000, the census sent a "long form" questionnaire with additional questions to a random sample of households (1 in 6 in 2000). In 2005, the Census Bureau replaced the long form with the American Community Survey (ACS), which is administered continuously throughout the decade. Only 3.5 million households are asked to complete the ACS each year, so you're quite lucky if you encounter it! What does the future of the census look like? Does it still matter in the 21st century? Quite a few countries no longer perform a traditional enumeration, but instead maintain a live register of their entire population. That requires a population that is willing to notify the authorities every time, for example, that they move homes. It looks like an increasing number of countries are headed in that direction, though I'm not sure it could happen in the U.S. anytime soon, not least because the census is mandated by the Constitution. Does the census still matter? Can you have sensible government on the basis of facts without something like a census (be that a decennial enumeration or a population register)? I don't think so. The census generates the population denominators in so many of the statistics that we depend upon to understand our society. Now more than ever, we're seeing how important good data is. How much should we worry about 100 deaths from a new disease? Amongst other things, that depends very much on whether they occur in a town of 5,000 or a city of 8 million. How will COVID-19 affect this census? Like everything related to COVID-19, it's currently very hard to know. Concretely, the Census Bureau has now delayed its "field operations" for a month. That is less of a problem than it sounds, because they were always planning to encourage the majority of households to reply online (or by mail or phone). As of “Census Day,” April 1, nearly 40 percent had already done that. All the official communications from the Bureau express optimism and reiterate the goal of counting everyone. But short of a miracle, a large number of households—around 50 million in 2010—will not self-respond, and so will need to be enumerated in what the Bureau calls "nonresponse follow-up" operations. That usually requires field work—knocking on doors, interviewing occupants in person. That can only be delayed so far before it must run up against the due date for delivery of initial numbers, which is December 31, 2020. 
And of course, the later field work happens, the lower the quality of responses, since you're asking people to remember “Who lived here on April 1?” So everything depends on whether the current [COVID-19] conditions persist for a month, six months, a year. Anna Diamond is the former assistant editor for Smithsonian magazine.
777bfcc7e00723f2273efe2c26f137fb
https://www.smithsonianmag.com/history/escape-from-boko-haram-180956333/
Escape From Boko Haram
Escape From Boko Haram Shortly before six o’clock in the morning on August 30, 2014, Margee Ensign, president of the American University of Nigeria, met with her security chief in the large house that she occupies on campus, in Yola, near the nation’s eastern border, in Adamawa State. The news was bad. The chief, Lionel Rawlins, had gone to get the half-dozen security guards that Ensign was counting on to help her with a daring rescue mission, but the guards were asleep, or perhaps pretending to be, and couldn’t, or wouldn’t, be roused. “They were afraid,” Rawlins later recalled. Running a college doesn’t often entail making split-second decisions about daredevil forays into hostile territory, but as this Saturday dawned for the energetic five-foot California native with a doctorate in international political economy, it was gut-check time. “The president looked at me and I looked at her, and I knew what she was thinking,” Rawlins said. “We’re going,” Ensign said. So they headed north in two Toyota vans, a suddenly meager contingent—Ensign, Rawlins, a driver and one other security guard—dashing down the crumbling two-lane highway through arid scrubland, deeper into remote country terrorized by the ruthless, heavily armed militant group called Boko Haram. Rawlins, a former U.S. Marine, had contacts with vigilante groups in northern Nigeria, and thought he might be able to summon them if the going got tough. “All the way up there I’m playing war games in my mind,” he remembered. After three tense hours on the road, expecting to be ambushed by terrorists wielding automatic rifles at any moment, the little convoy rounded a corner and Ensign saw 11 girls and their families and friends waving and yelling at the vehicles approaching in clouds of dust. The girls had attended a boarding school near Chibok, an obscure provincial town that is now famous because of the attack on the school the previous April. The astonishing crime attracted attention worldwide, including the Twitter campaign #BringBackOurGirls. On that nightmarish night of the April abduction, 57 of the 276 kidnapped girls were able to jump off the trucks that were spiriting them away, and flee into the bush. They eventually returned to their villages to spend the broiling summer with their families, fearing another kidnapping mission every night. One of those Chibok escapees had a sister at the American University of Nigeria, and it was she who approached Ensign in her campus office, pleading, “What can you do to help?” Ensign resolved to bring some of the girls who’d escaped to the university, where they could live and complete their secondary schooling before beginning college coursework, all on full scholarship. The girls and their parents warmed to the idea, then risked everything to make the extraordinary roadside rendezvous from their scattered small villages in the bush with the university president herself—an unforgettable encounter. “They were so scared, so skinny,” Ensign said of the girls. “They had no money, no food, and they had all their possessions in little plastic bags.” As the van engines kept running, Ensign leapt out, greeted the girls and their families and told them “with cool assurance” (Rawlins’ words) that all would be well. (“I didn’t get the fear gene,” Ensign later told me.) Quickly, about 200 locals gathered. Rawlins cast a wary glance at a group of men on the edge of the crowd whom nobody seemed to recognize. 
“We knew Boko Haram was in the area,” Rawlins said. He turned to Ensign and the others. “We’ve got ten minutes,” he told them. “Kiss everybody goodbye you want to kiss.” Then he began a countdown for the 22 people, girls and parents alike, who would go to Yola. “Five minutes. Three minutes. Two minutes. Get in the vans!” ********** Long before she assumed her post in Nigeria five years ago, Ensign was a citizen of the world. She was born and raised in affluent Woodland Hills, California, the youngest of five siblings, and began traveling at an early age, from Singapore to Turkey to France. “Both my parents were airline pioneers,” said Ensign. “My dad started loading bags at Western Airlines in 1940 and went on to become an executive at Pan Am. My mom was a flight attendant at Western when you had to be a registered nurse.” Ensign earned her PhD at the University of Maryland, and soon made a name for herself as an expert in economic development, especially in Africa, teaching at Columbia and Georgetown, running a management program for HIV/AIDS clinicians in East Africa, researching the causes of the 1994 Rwandan genocide. In 2009, she was teaching and serving as associate provost at the University of the Pacific when she was recruited to run the American University of Nigeria. Ensign’s job interview in Nigeria did not have an auspicious start. “I landed in Abuja, and nobody was there to pick me up,” she recalls. “So I hopped in a taxi, went to a crummy hotel and somebody called me at 2 a.m. and said, ‘Have you been kidnapped?’ I said, ‘No, I’m in a hotel.’ He said, ‘We’ve been looking for you all night!’” Eager for a new challenge, she signed on, despite her California physician’s dire warning that her severe peanut allergy would kill her—peanuts are a dietary staple in Nigeria. (She has landed in the hospital once, following a restaurant dinner involving an undeclared peanut sauce.) She was joined in Yola at first by her daughter, Katherine, then in her early 20s, who had grown up adventurous, accompanying her divorced mother to rural Guatemala and far-flung corners of Africa. After their two-week visit, Ensign escorted Katherine to Yola’s tiny airport. As the jet taxied down the runway and took off, Ensign began sobbing. “I turned around and there were hundreds of people standing around the terminal, watching. I remember thinking, ‘They probably think that a crazy person has moved to Yola.’ But as I walked toward the terminal, people reached out their hands and grasped mine. I knew that I would be OK there.” On the campus, Ensign settled into a four-bedroom villa (originally built for a traditional leader and his four wives), then set about remaking the university. She fired teachers, revamped security, forced out crooked contractors who were skimming millions of dollars. She commissioned buildings, including a hotel and library, started extracurricular programs, planted trees. And she required that all students spend time working directly with the underprivileged in Yola—tutoring street kids and coaching them in sports, distributing food and clothing in camps for people displaced by the fighting. The programs, she believes, serve as a strong counterweight to violent Islamist ideology. “Nobody knows any boys from Yola who joined Boko Haram,” she told me, sitting at a conference table in her office, a cheerful, sunlit space decorated with a large wall map of Adamawa State and a panel of colorful Nigerian folk art. ********** Half a century ago, Nigeria seemed poised for greatness. 
Oil had been discovered in the Niger Delta in 1956—four years before independence—promising to shower the country in riches and ease tensions between the country’s predominantly Muslim north and its Christian south, a legacy of arbitrary colonial border-making. Instead, a series of rapacious regimes, both military and civilian, looted the oil riches—stealing some $400 billion in the half century since independence, according to some sources—deepened the country’s destitution and fanned sectarian hatreds. Education in Nigeria has suffered, too. The secular education model introduced by Christian missionaries never took hold in the north, where an estimated 9.5 million children attend almajiri, or Islamic schools. Overall, of the nation’s 30 million school-age children, about 10 million receive no instruction. Eighty percent of secondary school students fail the final exam that permits advancement to college and the literacy rate is just 61 percent. There is a federal and state college system, but it is chronically underfunded; the quality of teachers is generally poor; and only about one-third of students are female. Ensign saw a chance to counter the corruption and dysfunction in Nigeria, which has the continent’s largest economy, by educating a new generation of leaders schooled in Western values of democracy, transparency and tolerance. Ensign “has an incredible commitment to building a nurturing environment in which students can learn,” says William Bertrand, a professor of international public health at Tulane and vice chairman of the AUN board. “Her whole vision of a ‘development university,’ which has evolved throughout her career, is extraordinary.” In fact, the values that Ensign holds dearest—secular education and intellectual inquiry—are anathema to Boko Haram. Boko Haram began in 2002 in Maiduguri, the capital of Borno State, the poorest and least developed corner of Africa’s most populous country. Its founder, a self-taught, fundamentalist preacher, Mohammed Yusuf, who believed that the world was flat and the theory of evolution was a lie, inveighed against Western education. In 2009, following escalating skirmishes in Maiduguri between his followers and Nigeria’s security forces, Yusuf was arrested and summarily executed by Nigerian police. A year later his radicalized disciples, who numbered about 5,000, declared war on the government. In a wave of atrocities across the north, 15,000 people have died at the rebels’ hands. The term “Boko Haram”—boko translates as “Western education” in the local Hausa language and haram as “forbidden” in Arabic—was conferred on the group by residents of Maiduguri and the local media. (Group members prefer to call themselves Jama’atu Ahlis Sunna Lidda’awati wal-Jihad, or People Committed to the Propagation of the Prophet’s Teachings and Jihad.) “Boko Haram” reflects Yusuf’s deep hatred of secular learning, which, he asserted, had become an instrument for Nigeria’s corrupt elite to plunder resources. That the terrorists target schools is no accident. At the all-female Chibok Government Secondary School, a sprawling compound of squat brown buildings surrounded by a low wall deep in the bush of Borno State, nearly all the students were Christians from poor farming villages nearby. For years, Boko Haram had been kidnapping girls and young women across the state, forcing them to marry and work as slaves in its camps and safe houses. 
The captors subjected the girls to repeated rapes, and, in a grisly reprise of the atrocities visited upon “child soldiers” elsewhere on the continent, forced them to take part in military operations. Less than two months earlier, Boko Haram insurgents had killed 59 when they attacked a boys’ dormitory in neighboring Yobe State, locked the doors, set the building on fire and immolated the students. Those who tried to escape were shot or hacked to death. The government had subsequently shut down all public secondary schools in Borno State. But in mid-April, the Chibok school reopened for a brief period to allow seniors to complete college-entrance exams. The state government and the military had assured the girls and their parents that they would provide full protection. In fact, a single watchman stood guard at the gate on the April night that uniformed Boko Haram fighters struck. Many girls assumed the men were Nigerian soldiers who had come to protect the school. “But I saw people without shoes, with these caftans on their necks, and I started going, ‘I’m not sure,’” one 19-year-old woman recounted to Ensign in a videotaped interview. “Deep inside me I felt that these people are not soldiers, not rescuers....They were telling the girls to go and enter the car, and I jumped through the window, I started running. I heard voices calling from behind me, ‘Come, come.’ I just kept on running. I was just in the bush [but] I knew I would find my way back home.” As the 19-year-old made her getaway, a dozen armed men charged into the dorm. One group guarded the girls. Another ransacked the school’s kitchen and loaded vehicles with bags of rice, corn and other food. A third group set fire to the buildings. The attackers led the students out of the compound at gunpoint and into vehicles. A handful of young women had the presence of mind to grab tree branches and swing out of the truck beds to freedom. Others fled during a stop to relieve themselves in the bush. The girls ran through the pathless scrubland, past stands of acacias and baobab trees, desperately hungry and thirsty, driven by the fear of being caught at any moment. One by one, they stumbled back through the fields to their families’ mud-brick houses. Since then, Boko Haram forces have been repelled here and there, but they have not relented and none of the 219 female students held captive have been released. Last fall, fighters advanced to within 50 miles of Yola, imposing sharia law in the towns they occupied, burning books, kidnapping women, conscripting young men and executing those who resisted. Four hundred thousand people fled to Yola, doubling the city’s population. “Our employees were coming to us, saying ‘I have 20 people living at my house,’” Ensign recalls. “We started giving them rice, maize and beans...and every week the numbers were getting bigger.” The Nigerian military advised Rawlins to close the campus. “The parents, students and faculty were pressuring her, saying, ‘You gotta leave,’” recalled Rawlins, who had heard that the rebels would not dare attack Yola because they were spread too thin and the city was well defended. “She remained calm and said, ‘We will do what we have to do, in the best interests of the students.’ She was vigilant and steadfast. She never wavered.” Weeks after I visited Yola, two Boko Haram suicide bombers attacked the city’s market and killed 29 people; an off-duty university security guard was badly injured. Still, Ensign remains undeterred. “I’m extremely hopeful,” she told me. 
“The [new] government is making all the right moves.” ********** The American University of Nigeria was established in 2003 with a $40 million investment from Atiku Abubakar, a Nigerian multimillionaire businessman and the nation’s vice president from 1999 to 2007. Orphaned as a boy and educated by U.S. Peace Corps volunteers, Abubakar, who made his money in oil and real estate, remains something of a contradictory figure: Allegations of corruption have followed him throughout his career. At the same time, U.S. diplomats, educators and others say that Abubakar—known around the university as the Founder—has made a genuine commitment to improving Nigeria’s education system. “The man I’ve known for five years is devoted to education and to democracy,” Ensign told me. “I have never seen an inkling of anything that isn’t completely transparent and focused on trying to improve people’s lives.” Yola is a hard place—a sprawl of corrugated tin-roofed houses and diesel-choked streets, fiercely hot in the summer, a sea of mud during the rainy season—and Ensign works to conjure a modicum of comfort. She has sought to surround herself with bits of home, even installing in the arts and humanities building a coffee bar called Cravings, complete with real Starbucks paper cups. “It’s our little American island,” she said. She plays squash at the University Club and jogs along the campus roads. She consumes the Italian detective novels of Donna Leon and the Canadian detective series by Louise Penny, and sometimes relaxes with DVDs of “Madam Secretary” and “The West Wing.” But the work is what keeps her going. She begins her day writing emails and discussing security with Rawlins, meets with faculty members and administrators, and teaches an undergraduate course in international development. There are weekly meetings with the Adamawa Peace Initiative, a group of civic and religious leaders she first convened in 2012. She’s also devoted to a “read and feed” program she started for homeless children who gather outside the university gates. Twice a week, under a big tree on campus, university staff members serve meals and volunteers read books aloud. “We’re up to 75 children,” she told me. “It helps to look in their faces and see that the little we’re doing is making a difference.” In April came a happy surprise. Over a crackling phone line in her office, Robert Frederick Smith, the founder and CEO of Vista Equity Partners, a U.S.-based private equity firm with $14 billion under its management, said he would cover the tuition, room and board for all the Chibok girls who’d escaped or evaded the terrorists—an offer worth more than a million dollars. (Ensign had brought ten additional escapees to the university, for a total of 21.) “It was like winning a sweepstakes,” she told me. “I started crying.” Alan Fleischmann, who handles Smith’s philanthropic efforts, said the investor “was frustrated that there was an enormous outcry after the kidnappings and then it vanished. The impression was that they were dead or going to die. Then he learned that some had escaped, and said, ‘Oh my God, they are alive.’” ********** Thirteen months after their desperate escape from the Boko Haram marauders, three Chibok girls—I’ll call them Deborah, Blessing and Mary—sat alongside Ensign in a glass-paneled conference room at the university’s new $11 million library. Ensign had allowed me to interview the young women if I would agree not to divulge their names and not to ask about the night of the attack. 
The young women seemed poised and confident, looked me forthrightly in the eye, displayed a reasonable facility with English and showed flashes of humor. They burst into laughter recalling how they gorged on a lunch of chicken and jollof (“one-pot”) rice, a Nigerian specialty, on their first day at the university—and then all became sick afterward. None had seen a computer before; they talked excitedly about the laptops that Ensign had given each of them, and about listening to gospel music and watching “Nollywood” movies (produced by the Nigerian film industry), Indian films and “Teletubbies” in their dormitory in the evenings. Blessing and Mary said they aspired to become physicians, while Deborah envisioned a career in public health. Deborah, an animated 18-year-old with delicate features, recalled the day last August when she walked for miles from her village to the rendezvous point, accompanied by her older brother. Exhausted after hiking through the night, she was also deeply unsettled by the prospect of being separated from her family. “But my brother encouraged me,” she said. After an emotional farewell, Deborah boarded the minivan with the other girls for the drive back to Yola. That first afternoon, Ensign hosted a lunch for the girls, and their parents, at the cafeteria. The adults fired worried questions at Ensign. “How long will you keep them?” “Do we need to pay anything?” Ensign assured them that the girls would stay only “as long as they wanted” and that they were on full scholarships. Later, she took the girls shopping, leading them through Yola’s market as they excitedly chose clothes, toiletries, Scrabble games, balls and tennis shoes. The girls admired their new sneakers, then looked, embarrassed, at Ensign. “Can you show us how to lace them up?” asked one. Ensign did. The campus dazzled the Chibok girls, but they struggled at first in class—particularly with English. (Their native language is Hausa, spoken by most in Borno State.) In addition to providing the laptops, Ensign arranged for tutoring in English, math and science, and assigned student mentors who live with them in the dormitory and monitor their progress. They remain tormented by thoughts of the Chibok students who remain in captivity. Three weeks after the abductions at their school, Boko Haram’s leader, Abubakar Shekau, released a video in which he threatened to sell the girls as slaves. The escapees watched with rising hope as the world focused on the Chibok tragedy. The United States, Britain and other countries put military personnel on the ground and provided satellite surveillance of the rebels. But as time went on, the mission to rescue the girls bogged down, the world turned away from the story, and the escapees felt a crushing sense of disappointment. In April, Nigerian President-elect Muhammadu Buhari—who campaigned on a pledge to crush Boko Haram—acknowledged that efforts to locate the girls so far had failed. “We do not know the state of their health or welfare, or whether they are even still together or alive,” he said. “As much as I wish to, I cannot promise that we can find them.” At the beginning of their time at the university, says Ensign, the Chibok women “only wanted to pray with one another.” But as the months passed, Ensign made it clear that alternatives were available to help them. 
“They didn’t understand the concept of counseling, but we said, ‘This is here if you want it.’” A turning point came last Christmas, when Boko Haram fighters attacked a village and murdered the father of one of the Chibok escapees at AUN. “[The student] was totally devastated,” Ensign says. “Her mom wanted to take her home, and we said, ‘Can we work with her a little bit?’ and her mom agreed.” Ensign brought in Regina Mousa, a psychologist and trauma counselor from Sierra Leone, who met with the girl, calmed her down and made the other girls see the benefits of counseling. Mousa set up thrice-weekly therapy sessions in the dormitory common room for groups of three to five girls, and conducted emergency individual interventions, sometimes in the middle of the night. Many of the girls, Mousa told me, were terrified of being alone, prone to collapse into sobbing, and, above all, stricken with guilt about having escaped while their friends were held captive. In therapy sessions, the girls go around the room, talking about their connections to the captives, voicing anguish as they imagine the others’ horrific lives. “I tell the girls that what happened has no reflection on them—it just happened at random, they were just in the wrong place at the wrong time,” Mousa says. “I tell them that they should now work hard, and aspire to do well so that these others will be proud, and that we’re sure that they will find them.” Recently she shared with them military and eyewitness reports “that the girls had been spotted alive in the Sambisa Forest,” a 200-square-mile former nature reserve 200 miles north of Yola. “That raised their hopes.” Still, reassurance does not come easy. Boko Haram has struck the Chibok region with impunity, returning to attack some villages three or four times. Many Chibok women at the university have lost touch with family members who “fled into the bush,” says Mousa, increasing the girls’ sense of isolation. “Whenever there is an attack, we have to go through the intensive therapy again,” says Mousa. “Everything comes crashing down.” On April 14, the one-year anniversary of the Chibok abductions, the women “were completely devastated,” Ensign recalled. “I went to meet with them. They were in each other’s arms, crying, they couldn’t talk. I asked ‘What can we do to help?’ They said, ‘Will you pray with us?’ I said, ‘Of course.’ We held hands and prayed.” Mousa met with them, too: “We talked again about the captured girls, and the need for the escapees to be strong for them and to move forward so that when the girls come back they can help them.” Ensign stays in close contact with the Chibok women, throwing open her office, visiting them frequently in the dormitory common room. “The girls are coming by to say hello, many times during the week,” she told me. “I have them over to my house several times a semester for dinner.” Ensign, who calls herself “the world’s worst chef,” has her cook prepare traditional Nigerian food. Ensign’s ambition is large—“I want to find and educate all Chibok girls who have been taken,” she told me—but she’s also a staunch advocate of the healing power of the small gesture. One hot Sunday morning some months ago, she first took the girls down to the University Club’s Olympic-size outdoor swimming pool, and distributed the one-piece Speedo bathing suits that she had purchased for them during a break in the U.S. The girls took one look at the swimsuits and burst into embarrassed laughter; some refused to put them on. 
Using gentle persuasion, Ensign—who grew up on the Pacific Coast and is a confident swimmer and surfer—nudged them into the shallow end of the pool. The girls have shown up most Sunday mornings—when the club is deserted and there are no men around. “None had ever been in the water, some were scared, most were laughing hysterically,” Ensign recalls. “They were like little kids, and I realized this is what they need. They need to capture that fun childhood.” Half a dozen of them, Ensign adds almost as an aside, have already achieved what she was hoping for: They can swim. Joshua Hammer is a contributing writer to Smithsonian magazine and the author of several books, including The Bad-Ass Librarians of Timbuktu: And Their Race to Save the World's Most Precious Manuscripts and The Falcon Thief: A True Tale of Adventure, Treachery, and the Hunt for the Perfect Bird.
76b918402137b21bd5e82811ad70224b
https://www.smithsonianmag.com/history/everything-was-fake-but-her-wealth-4621153/
Everything Was Fake but Her Wealth
Everything Was Fake but Her Wealth Ida Wood never had any intention of renewing contact with the outside world, but on March 5, 1931, death made it necessary. At four o’clock that afternoon, the 93-year-old did something she hadn’t done in 24 years of living at the Herald Square Hotel: she voluntarily opened the door, craned her neck down the corridor, and called for help. “Maid, come here!” she shouted. “My sister is sick. Get a doctor. I think she’s going to die.” Over the next 24 hours various people filtered in and out of room 552: the hotel manager, the house physician of the nearby Hotel McAlpin and an undertaker, who summoned two lawyers from the venerable firm of O’Brien, Boardman, Conboy, Memhard & Early. The body of Ida’s sister, Miss Mary E. Mayfield, lay on the couch in the parlor, covered with a sheet. The room was crammed with piles of yellowed newspapers, cracker boxes, balls of used string, stacks of old wrapping paper and several large trunks. One of the lawyers, Morgan O’Brien Jr., began questioning hotel employees, trying to assemble the puzzle of this strange and disheveled life. The manager said he had worked at the hotel for seven years and had never seen Ida Wood or her deceased sister. His records indicated that they had moved into the two-room suite in 1907, along with Ida’s daughter, Miss Emma Wood, who died in a hospital in 1928 at the age of 71. They always paid their bills in cash. The fifth-floor maid said she hadn’t gotten into the sisters’ suite at all, and only twice had persuaded the women to hand over soiled sheets and towels and accept clean ones through a crack in the door. A bellhop said that for many years it had been his habit to knock on the door once a day and ask the ladies if they wanted anything. They requested the same items every time: evaporated milk, crackers, coffee, bacon and eggs—which were cooked in a makeshift kitchenette in the bathroom—and occasionally fish, which they ate raw. Ida always tipped ten cents, telling him that money was the last she had in the world. From time to time they also requested Copenhagen snuff, Havana cigars and jars of petroleum jelly, which Ida massaged onto her face for several hours each day. She was five feet tall and 70 pounds, nearly deaf and stooped like a question mark, but her face still bore clear evidence of its former beauty. “You could see what an extraordinarily pretty woman she once was,” O’Brien noted. “Her complexion, in spite of her age, was as creamy and pink and unwrinkled as any I have ever seen. It was like tinted ivory. Her profile was like a lovely cameo.” She hadn’t had a bath in years. As the undertaker prepared her sister’s body just a few feet away, Ida Wood suddenly grew talkative. She said she had been a celebrated belle in the South and a prominent socialite in the North. Her husband was Benjamin Wood, the brother of Fernando Wood, former mayor of New York and perennial congressman. She had, despite her complaints to the bellhop, a good deal of cash stashed in her bedroom. At first they all thought she was senile. O’Brien called his elderly father, who confirmed at least part of her story. When he was a lawyer in the 1880s, he said, he had known Ida Wood quite well, both professionally and socially. She had been known for both her beauty and her business sense, and was indeed the widow of Benjamin Wood, erstwhile owner of the New York Daily News and brother of the mayor. He doubted she was destitute, and encouraged his son to take her case regardless of her ability to pay. 
The younger lawyer obliged and began looking into Ida’s finances. A representative from Union Pacific revealed that the sisters owned about $175,000 worth of stock and had not cashed their dividends for a dozen years. Examining the sale of the New York Daily News, O’Brien learned that Ida had sold the paper in 1901 to the publisher of the New York Sun for more than $250,000. An old acquaintance reported that she sold all of the valuable possessions she’d acquired over the years—furniture, sculptures, tapestries, oil paintings. An officer at the Guaranty Trust Company remembered Ida coming to the bank in 1907, at the height of the financial panic, demanding the balance of her account in cash and stuffing all of it, nearly $1 million, into a netted bag. Declaring she was “tired of everything,” she checked into the Herald Square Hotel and disappeared, effectively removing herself from her own life. Ida first came to New York in 1857, when she was 19 and determined to become someone else. She listened to gossip and studied the society pages, finding frequent mention of Benjamin Wood, a 37-year-old businessman and politician. Knowing they would never cross paths in the ordinary course of events, she composed a letter on crisp blue stationery: May 28, 1857 Mr. Wood—Sir Having heard of you often, I venture to address you from hearing a young lady, one of your ‘former loves,’ speak of you. She says you are fond of ‘new faces.’ I fancy that as I am new in the city and in ‘affairs de coeur’ that I might contract an agreeable intimacy with you; of as long duration as you saw fit to have it. I believe that I am not extremely bad looking, nor disagreeable. Perhaps not quite as handsome as the lady with you at present, but I know a little more, and there is an old saying—‘Knowledge is power.’ If you would wish an interview address a letter to No. Broadway P O New York stating what time we may meet. Although Benjamin Wood was married, to his second wife, Delia Wood, he did wish an interview, and was pleasantly surprised to find someone who wasn’t “bad looking” at all: Ida was a slight girl with long black hair and sad, languorous eyes. She told him she was the daughter of Henry Mayfield, a Louisiana sugar planter, and Ann Mary Crawford, a descendant of the Earls of Crawford. Ida became his mistress immediately and his wife ten years later, in 1867, after Delia died. They had a daughter, Emma Wood, on whom they doted. No one dwelled on the fact that she had been born before they wed. As the consort and then wife of Benjamin Wood, Ida had access to New York’s social and cultural elite. She danced with the Prince of Wales during his 1860 visit to the city. Less than a year later she met Abraham Lincoln, who stopped in New York on his way from Illinois to Washington as president-elect. Reporters called her “a belle of New Orleans” and admired the “bright plumage and fragile beauty that made her remarkable even in the parasol age.” Every afternoon around four o’clock, attended by two liveried footmen, she went for a carriage ride, calling for Benjamin at the Manhattan Club. He emerged right away and joined her. She sat rigidly beside him, tilting her fringed parasol against the sun, and together they rode along Fifth Avenue. There was one significant divide between them: Ida excelled at saving money, but Ben was a careless spender and avid gambler. He played cards for very high stakes, once even wagering the Daily News; luckily he won that hand. 
He often wrote letters to Ida apologizing for his gambling habits, signing them, “unfortunately for you, your husband, Ben.” The next day he would be back at John Morrissey’s gambling hall on lower Broadway, where he won and lost large sums at roulette. Once he woke Ida up, spread $100,000 across their bed, and giddily insisted she count it. Ida devised methods for dealing with Ben’s addiction, often waiting outside the club so that if he won she was on hand to demand her share. If he lost, she charged him for making her wait. She promised not to interfere with his gambling as long as he gave her half of everything he won and absorbed all losses himself. When he died in 1900, the New York Times wrote, “It was said yesterday that Mr. Wood possessed no real estate and that his personal property was of small value”—a true statement, in a sense, since everything he’d owned was now in Ida’s name. In the course of reconstructing Ida’s eventful life, O’Brien sent another member of his law firm, Harold Wentworth, back to the Herald Square Hotel. Harold brought Ida fresh roses every day. Sometimes she stuck them in a tin can of water; other times she snapped off their buds and tossed them over her shoulder. The firm also hired two private detectives to take the room next door and keep a 24-hour watch over her. While Ida smoked one of her slender cigars, slathered her face with petroleum jelly, and complained she couldn’t hear, Harold shouted at her about uncashed dividend checks, hoarded cash, the possibility of robbery and how she really should let the maid come in to clean the rooms. Although Harold tried to be discreet, word about the rich recluse of Herald Square got around. One day a man named Otis Wood came to the firm’s office, identified himself as a son of Fernando Wood’s and a nephew of Ida’s, and said he would like to help her. The firm took him, his three brothers and several of their children as clients. Soon afterward, Benjamin Wood’s son from his first marriage and some of his children came forward and hired their own firm, Talley & Lamb. They all seemed to agree that the best way to help Ida was to have her declared incompetent, which, in September 1931, she was. With the help of two nurses, and in the presence of members of both factions of the Wood family, Ida was moved to a pair of rooms directly below the ones she had occupied for so many years. She wept as they escorted her downstairs. “Why?” she asked. “I can take care of myself.” Her old suite was searched and inside an old shoebox they found $247,200 in cash, mostly in $1,000 and $5,000 bills. They thought that was all of it until the following day, when a nurse tunneled a hand up Ida’s dress while she slept and retrieved an oilcloth pocket holding $500,000 in $10,000 bills. Next they examined Ida’s 54 trunks, some stored in the basement of the hotel, others in an uptown warehouse. Inside lay bolts of the finest lace from Ireland, Venice and Spain; armfuls of exquisite gowns, necklaces, watches, bracelets, tiaras and other gem-encrusted pieces; several $1,000, $5,000, and $10,000 gold certificates dating back to the 1860s; a gold-headed ebony stick (a Wood family heirloom that had been a gift from President James Monroe), and an 1867 letter from Charles Dickens to Benjamin Wood. Each trunk was taken to the Harriman National Bank, where the contents were placed in vaults. In an old box of stale crackers they discovered a diamond necklace worth $40,000. 
They dug up her sister’s coffin and the undertaker inspected its contents, finding nothing but Mary Mayfield’s remains. There was not much left to do except wait for Ida Wood to die. In that regard, as in everything else, Ida proved stubborn. Reporters, as yet unaware of brothers Homer and Langley Collyer living in similar squalor in Harlem, descended upon her hotel room. Her mind wandered from the past to the present but remained ever suspicious and alert. When nurses brought her food she asked, “How much did this cost?” If the answer was more than a dollar, she pushed it away and said, “It’s too much. Take it back. I won’t eat it.” On several occasions, when the nurses weren’t looking, she shuffled to a partly opened window and tried to scream above the roaring traffic of Herald Square: “Help! Help! I’m a prisoner. Get me out of here!” Other times she treated the nurses as her confidantes, sharing what they believed were cherished memories. “I’m a Mayfield,” she told them. “They used to spell it M-a-i-f-i-e-l-d in the old days, you know. I grew up in the city of New Orleans, a wonderful city.… My mother had a very good education, you know. She spoke German, Spanish and Italian, and she wanted me to be educated too, so she sent me to boarding school in New Orleans.” Letters from these Southern relatives, the Mayfields, began to pour in, but Ida was too blind to read them herself. Crawfords also jockeyed for attention, all of them ready to prove their ancestry to a branch of the Earls of Crawford. One missive addressed Ida as “Dear Aunt Ida” and promised to take care of her. Its writer claimed to be the “daughter of Lewis Mayfield.” The nurse who read the letter to Ida asked if she knew the writer, and Ida replied that she had never heard of her. All told, 406 people claimed to be her heirs. By now Ida, too, was waiting for her death. She didn’t bother to dress, wearing her nightgown and ragged slippers all day, and stopped battling any attempt to take her temperature. She had nothing left but the exquisite fantasy she’d created, one that—to her mind, at least—had seemed more right and true with each passing year. Only after she died, on March 12, 1932, did all of the lawyers and supposed relatives unravel the mystery of her life: Her father wasn’t Henry Mayfield, prominent Louisiana sugar planter, but Thomas Walsh, a poor Irish immigrant who had settled in Malden, Massachusetts, in the 1840s. Her mother had little formal education and grew up in the slums of Dublin. Ida’s real name was Ellen Walsh, and when she was in her teens she adopted the surname Mayfield because she liked the sound of it. Her sister Mary took the name too. Emma Wood, her daughter with Benjamin Wood, wasn’t her daughter at all, but another sister. Her husband never divulged her secrets. Toward the end, when the shades were drawn and the tattered lace curtains pulled tight, Ida shared one final memory. When she was a young girl she noticed a sign in a storefront window: “Your Future and Fortune Told.” She saved up the money for a consultation. In the dingy parlor, the old gypsy seer traced rough fingertips over her palms and spoke in dulcet tones. “My dear,” she said, “you are going to be a very lucky girl. You are going to marry a rich man, and get everything you want out of this life.” Ida believed it was true—and that, at least, they could never take away. Sources: Books: Joseph A. Cox, The Recluse of Herald Square. 
New York: The Macmillan Company, 1964; Benjamin Wood and Menahem Blondheim, Copperhead Gore: Benjamin Wood’s Fort Lafayette and Civil War America. Bloomington, IN: Indiana University Press, 2006. Articles: St. Clair McKelway, “The Rich Recluse of Herald Square.” The New Yorker, October 31, 1953; “Recluse Hid $1,000,000 in Her Hotel Room.” New York Times, March 13, 1932; “406 Claimants Out As Ida Wood Heirs.” New York Times, September 1, 1937; “Recluse Glimpses Wonders of Today.” New York Times, October 8, 1931; “Recluse’s Trunks Yield Dresses, Jewels, and Laces Worth Million.” New York Times, October 17, 1931; “Aged Recluse, Once Belle, Has $500,000 Cash In Skirt.” Washington Post, October 10, 1931; “Ida Wood’s Early Life Is Revealed.” Hartford Courant, September 16, 1937; “Who Gets This $1,000,000?” Seattle Sunday Times, August 18, 1935; “Mrs. Wood’s Forty Trunks Will Be Opened Today.” Boston Globe, November 2, 1931. Karen Abbott is a contributing writer for history for Smithsonian.com and the author of the books Sin in the Second City and American Rose. Her forthcoming book, Liar, Temptress, Soldier, Spy, will be published by HarperCollins in September.
91f8c626616866a36e1a8eece0d9f7ed
https://www.smithsonianmag.com/history/everything-you-didnt-know-about-clarence-darrow-14990899/
Everything You Didn’t Know About Clarence Darrow
Everything You Didn’t Know About Clarence Darrow Clarence Darrow exists foremost in the public memory as Spencer Tracy, who played a lawyer based on Darrow in the 1960 movie Inherit the Wind. That film, in turn, was based on Darrow’s 1925 defense of a Tennessee educator accused of breaking a state law banning the teaching of evolution in public schools. (Darrow lost The State of Tennessee v. Scopes, or the “monkey trial,” as it was known; the law was later repealed.) But as John A. Farrell makes clear in his new biography, Clarence Darrow: Attorney for the Damned, Darrow’s life was even more tumultuous than that sensational trial would suggest. Before Darrow became the champion of labor, proponent of the poor and defender of the most hopeless of death-row cases, he was a corporate lawyer—and for a railroad, no less. What turned him away from a career as a fat cat? He couldn’t look at himself in the mirror. He was at heart one of the most compassionate people you could imagine meeting, and that part of him was always at war with the striver, the go-getter. But whenever the chips came down, they always came down on the side of the guy who needed a good lawyer. Depending on how he was fixed at any given time, a third to a half of his cases he was handling for free for indigent clients. He didn’t charge big fees for his most notorious clients if there was a good cause behind it. It was just conscience, basically, that forced him to give up that job as counsel for the Chicago & North Western Railway. He was also prompted by his boss, his patron at the railroad, who had a sudden heart attack and died, so Darrow’s decision was helped along by the fact that he no longer had a career there. He operated for a while as a political lawyer in Chicago when the words “politics” and “Chicago” were pretty much synonymous with “graft” and “corruption.” How did he avoid the taint of that time and place? He didn’t, entirely. He got involved in several of the scandals of the time, but even crooked politicians need a good lawyer, and sometimes the law is applied in courts that are straight. So there was a respect for Darrow among the political boys for his ability to actually get things done, to run things, while they pursued their tricks and their deals. At the same time he was an idealist, and in fact one of the movers in the attempt by the Populists to spread their campaign from the farms, where it was born, to the cities. Of course, William Jennings Bryan became Darrow’s most famous foil during the monkey trial. Yet the two men were aligned in the 1896 presidential campaign. What brought them together, however briefly? You had the growth of the Populist movement—a widespread feeling out in the West and Midwest that the financiers of the East were using the gold standard to keep the average farmer and the average working man in poverty. For the first time, in Chicago in 1896 [at the Democratic National Convention], you had a major party declare that it was going to represent the poor. That was Bryan’s amazing feat of political rhetoric: he was this young, unknown congressman and he stood up there and he captivated that convention hall and brought the Populists and the Democrats together. Darrow was part of that same movement, but he never particularly cared for Bryan as a person. He thought Bryan was too religious and basically too stupid to lead a major party, and it really grated on him that Bryan got the presidential nomination three times. 
So their rivalry began to simmer and fester, and when Darrow had a chance to ambush Bryan in the courtroom in Dayton, Tennessee, in 1925, he took full advantage of it. In Darrow’s day there was open warfare between labor and capital. He stepped into that war in a major way in Idaho in 1907, when he defended Big Bill Haywood and two other unionists charged with murdering a former governor. You write that, “Of all of Darrow’s courtroom speeches, his summation in the Haywood case was arguably the most brilliant, and dangerous.” In what way brilliant, and in what way dangerous? It’s brilliant in its eloquence. In those days attorneys and prosecutors could speak for up to 12 hours, or even longer—Darrow, in the Leopold and Loeb case, spoke for three days. The Haywood summation is long, and to the modern ear it tends to wander, but you have to think of him standing in the courtroom and speaking to the jury, and going back and forth over his major themes like a weaver. That speech is amazing, for his ability both to tear apart the prosecution’s case and to draw from the jurors—who were not union men, but were working men—an appreciation for what labor was trying to do. It was extraordinarily dangerous because he was using a plea for a client as a soapbox. He made a very political speech, talking in almost socialistic terms about the rights of the working class, and there was a danger that the jury would react against that—as one of his juries later did in Los Angeles. But it was a very small courtroom and the defense table was right up against the jurors; over the course of 90 days he got a very good sense of who they were, talking during breaks, listening to them, watching them as they listened to the testimony. I think it was an informed bet he was willing to make. In that trial, there was a whisper that Darrow, or someone working for the defense, tried to bribe potential witnesses. And after he defended two brothers accused of firebombing the Los Angeles Times in 1911, Darrow himself was tried—twice—on charges that he’d bribed jurors in that trial. He was acquitted the first time, but the second case ended with the jury hung 8-4 for convicting him. So: Did he do it? In the book I argue that he almost certainly did. It’s going to be a puzzle for historians forever; I don’t think we’re ever going to find one piece of paper on which Darrow wrote to one of his cohorts, “Hey, did you make sure you got the juror that bribe?” But all the evidence indicates—well, there certainly was an attempt by the defense to bribe jurors; the question is, to what extent did Darrow know about it and to what extent did he actually inspire it? One of the most compelling things for me was to find in his mistress’s diary from years later that she concluded he had the capacity to do it. She had been his most faithful supporter and had insisted on his innocence. He was very careful in talking to his friends and family about the charges. He never actually said, “I didn’t do this.” He pled not guilty, but he believed that guilt was always a matter of motive and intent. And in this case he thought he had a good motive and a good intent because he was fighting for labor. Darrow grew up on a hardscrabble farm in Ohio and told his friend Jane Addams, “I have never been able to get over the dread of being poor, and the fear of it.” But he had a pretty complicated relationship with money, didn’t he? He did, and it got him into a lot of trouble. 
His law partner for a time was Edgar Lee Masters, the famous poet, and Masters said it was the money that ruined him. And Darrow did need money, because, for one thing, he was a womanizer. He was supporting two households—his first wife and their son, and then his second wife. It also cost money to run around chasing other women. Another problem is that he was an awful investor. His second wife, Ruby, once wrote to one of his sisters and said, well, Clarence’s new idea is for a ranch in California, and I guess that’s better than an empty gold mine or any of the other crackpot schemes he always jumps at. One of the sadder things about his life is that he finally got his money into a sound natural-gas company in Colorado, and when he sold his interest in the 1920s he had enough money to retire. And then he lost it all in the crash, so he had to go out in his 70s making speeches and public appearances and doing stunts like defending Benedict Arnold on the radio, just to keep the wolf away from the door. And speaking of complicated relationships: as you said, Darrow was twice married and a serial philanderer. What was up between Darrow and women? There is a philosophical consistency, in that he was an advocate of the free-love movement of his day. In Victorian America the times were so repressive, particularly for women. One of Darrow’s clients was a well-respected gynecologist from Chicago who wanted to write in the American Medical Association journal that it was okay to have pleasure from sexual relations. The other doctors in the AMA said no, we’re not going to say anything like that; sex is for procreation; it might be for pleasure if men can go to bordellos, but certainly not for women at home. That’s the kind of climate that the free-love movement moved against, and Darrow was a supporter of it. As far as I can tell, he was up front with his mistresses and the young ladies that he met in the free-love cause, and they agreed that this was a natural inclination and you shouldn’t try to repress it. Politically, he was a very early feminist; he argued in the 1880s for giving women the vote. But later he soured on the suffragette movement because it aligned itself with Prohibition, which he hated. He didn’t speak or campaign against giving women the vote, but there was a marked loss of enthusiasm for what he had thought would be a very good thing for the country. Darrow loved the company of friends and the balm of candid conversation, but at times some of his friends questioned his choice of cases and causes. Why? There was a feeling, at least up until the trial in Los Angeles, that he was motivated by money, that he saw the opportunity for a very skilled labor lawyer and took it. You find newspaper editorials and people saying, for somebody who’s talking about the cause of labor, he sure is making a lot of money off the poor working man. But after Los Angeles and his disgrace, he had a second act, and it was redemptive. He represented an awful lot of indigent clients and took a lot of civil rights cases. The two major cases of his career came when he was in his 60s—the Leopold and Loeb case and the monkey trial. Also his defense in the Sweet trial, which is the key in deciding whether you like him or not. After the monkey trial he was without a doubt the most famous trial lawyer in America. He could have commanded titanic fees from any corporation in America; they would have loved to have him. 
And instead, he used his fame to go to Detroit and represent for $5,000 over nine months a group of African Americans who had been trapped in a house by a racist mob at a time when the city was whipped into a hateful frenzy by the Ku Klux Klan. [The homeowner, an African American physician named Ossian Sweet, had just bought the house in a white neighborhood; when the mob stoned his house, some men in the house returned fire with guns, killing a white neighbor. The 11 men in the house were charged with murder.] He got them acquitted in an amazing trial that basically put down in law something we take for granted today—that if we believe a person has the right to defend his home, then African Americans have that right, too. Darrow was a founding attorney for the NAACP, and this was a big case for the NAACP. So that’s how he chose to invest all the fame and potential riches he could have had after his triumph in Dayton, Tennessee. Tom Frail is a senior editor for Smithsonian magazine. He previously worked as a senior editor for the Washington Post and for Philadelphia Newspapers Inc.
c6b60ed08e6fc566d8bc0d26e02f610a
https://www.smithsonianmag.com/history/extraordinary-disappearing-act-nazi-banned-novelist-180976824/
The Extraordinary Disappearing Act of a Novelist Banned by the Nazis
The Extraordinary Disappearing Act of a Novelist Banned by the Nazis The greatest trick that Irmgard Keun ever played was convincing the world she didn’t exist. Once an acclaimed German novelist, the then 31-year-old Keun had been living the life of an exile in France and the Netherlands since 1936. Three years earlier the Nazis had condemned her enormously popular recent novels, which dealt with subjects like independent women in Berlin’s seedy underworld, as “anti-German.” Keun was in Holland in 1940 when the fascists began their occupation of the Netherlands. With apparently nowhere left to turn, she took her own life—or so a British newspaper reported in August of the same year. But the story was false. Keun had used it as cover to return to Germany to see her parents. When you’re that good at disappearing, sometimes you can’t help staying hidden. Keun lived in obscurity until the 1970s, when her books were rediscovered by a new generation of German readers. The young Germans of the ’70s were trying to reckon with their nation's horrific past, which many of their parents were directly implicated in, so Keun’s steadfast refusal to conform to Nazi strictures during the Third Reich must have come as an inspiration to them. Recent English translations are now introducing those works to a broader audience and restoring Keun’s status as a unique, fearless novelist of interwar Germany. Her stories of average Germans, mostly young women, attempting to make their way in the world despite fascism are refreshingly ironic—unless, of course, you’re the fascist being belittled. Keun’s disappearing act, amid the general chaos of Germany in the interwar and post-war periods, makes piecing together the author’s life a bit of a challenge. Award-winning translator Michael Hofmann has produced two recent English-language versions of Keun’s novels, yet is still unsure of her life story. “Definite biographical facts about Keun are very thin,” he admits. We know that Keun was born in Berlin in 1905 and began her professional life as an actress around 1921. She later turned her attention to writing, publishing the novels Gilgi, One of Us in 1931 and The Artificial Silk Girl in 1932. Both sold well, making Keun rich and famous. In a contemporary review, the New York Times praised Gilgi’s “freshness” as standing “in delightful contrast to the books written by men.” But popularity came with a price. The Artificial Silk Girl tells the story of a young woman in contemporary Berlin who resorts to prostitution and theft on her quest to become a cabaret star. The Nazis had come to power the year after the book was published and disapproved of it vehemently. As one critical reviewer wrote, Keun produced “vulgar aspersions against German womanhood,” which were quite incompatible with Nazi ideas of refinement. “Anything like an autonomous woman was anathema to the Nazis,” Hofmann observes. Accordingly, Keun was blacklisted. “She despised them,” Hofmann says of Keun’s feelings towards the Nazis. “To her, they were idiots dressing up in uniforms and shouting and goose-stepping about the place.” Following her blacklisting and unsuccessful attempt to sue the Gestapo for the loss of income resulting from their confiscation of her work in 1933, Keun fled Germany for expatriate life, shuttling between France and the Netherlands. 
She joined other German writers in exile, like Thomas Mann, Stefan Zweig, and Joseph Roth, all of whom had likewise run afoul of Nazi censors. Unlike the historical fiction produced by those men, Keun’s work in exile remained focused on daily realities, becoming more and more explicitly political, though always with an ironic edge. In After Midnight, published in 1937, a young woman falls in love with her cousin, only to have her aunt sabotage the relationship by informing the police that the protagonist has insulted Nazi leader Hermann Göring.

Keun continued publishing, but the instability of exile, the Nazi censorship that kept her from reaching German readers, and the growing certainty of war all diminished her audience. Her small circle of fellow exiles and Dutch readers was minuscule compared to her former readership. The Artificial Silk Girl had sold nearly 50,000 copies before being banned; Hofmann estimates that her subsequent novels reached fewer than five percent of those readers. So when news began circulating that she had killed herself, it seemed plausible enough.

“She was still in Holland, in 1940, and her suicide was announced in a British newspaper,” says Hofmann of Keun. “Somehow, she took advantage of that, got some false papers, and went back to her parents just across the border, in Cologne.”

The finer details of this episode remain unclear. Whether Keun deliberately worked with an editor to plant the false story, or merely took advantage of a bureaucratic error amid the Nazi invasion, the fiction of her untimely demise persisted. How she then crossed the border between the Netherlands and Germany, whether by seducing a Nazi officer into providing papers or by straightforward forgery, is also a mystery. Regardless, Keun, or “Charlotte Tralow,” as her nom de plume became, was back in Germany.

Keun’s riveting return home parallels her novel Ferdinand, the Man with the Kind Heart. Written in 1950, Ferdinand is the story of a conscripted soldier who returns to Cologne from a prisoner-of-war camp to grapple with postwar life. In Keun’s signature ironic yet endearing style, the novel offers readers a glimpse of Germans amid the rubble and rations, women hoarding for sport and men celebrating their proof of de-Nazification. Germany is supposedly returning to normal, but Ferdinand, the narrator, just wants to return to living:

When I got back to Germany from camp, I still wasn’t a private individual. I wasn’t any Herr Timpe, Ferdinand Timpe. I was a returnee. … To be honest, I can’t stand the word “returnee.” It sounds a bit like the name of a vacuum cleaner or something. Something maneuverable. Gets in the corners and edges. It has something that smells of home and being looked after. Home for the homeless, home for fallen women, home for convicts, home for neglected children.

Unlike the defeated former Nazis or the belatedly victorious anti-fascists, Ferdinand does not want to be part of the political life of Germany. He admits that, during Hitler’s rise, he was involved in neither the Nazi seizure of power nor the opposition to it, and that he was merely dragged along into the war. Now that World War II is over, he sees the Cold War simmering (Germany was formally divided between East and West in 1949) and once again wants no part of it. He wants to be a person rather than a political subject. This insistence on independence, however, pushes the reality of collective crimes like the Holocaust out of sight, where it is ignored by Ferdinand and Keun alike.
“He’s charming, woozy, passive,” says Hofmann of Ferdinand. “Social and political movements mystify him, leave him indifferent. He’s like a speck of saffron swept up by a magnet, along with all the iron filings.”

Published for the first time in English last month, Ferdinand was Keun’s final novel. She spent the remainder of her life in and around Cologne, where she died in 1982. Her former literary fame eluded her until the 1970s, when her books began to be reissued in German. English translations, some by Hofmann, some by his late colleague Anthea Bell, began appearing in the 2000s, and the literary world once again praised Keun as a unique voice among interwar German writers.

The tragedy of this recent praise is that Keun faced such stark consequences for her novels in her own time. While the Nazis undoubtedly spared few of their victims, foremost among them the Jews whom Ferdinand forgets, Keun puts into his mouth a pair of lines that could well summarize the absurdity that defined her career: “It’s not so easy to write a love story in today’s Germany. There are strict laws.”

Arvind Dilawar is a writer and editor whose articles, interviews, and essays on everything from the spacesuits of the future to love in the time of visas have appeared in Newsweek, The Guardian, Vice, and elsewhere.