| id | url | title | contents |
|---|---|---|---|
87b76385d8c01d9e20d0e6fed448999a | https://www.britannica.com/biography/Wu-Sangui | Wu Sangui | Wu Sangui
Wu Sangui, Wade-Giles romanization Wu San-kuei, (born 1612, Liaodong, China—died Oct. 2, 1678, Hengzhou, Hunan), Chinese general who invited the Manchu of Manchuria into China and helped them establish the Qing dynasty in 1644. Later, in southwestern China, he led a revolt against the Qing in an attempt to set up his own dynasty.
Wu had been the Ming general in charge of defending the northeast frontier against the Manchu. When the imperial capital at Beijing was attacked by the rebel bandit leader Li Zicheng, Wu’s forces were summoned to aid in raising the siege, but the city fell (April 1644) before his arrival. Li then advanced against Wu, who appealed to the Manchu for aid. A combined force of Ming and Manchu troops drove Li from Beijing, where the Manchu then set up the Qing dynasty. Although loyal Ming officials beseeched Wu for aid in restoring the Ming dynasty, he accepted high rank from the Manchu and for nearly 30 years fought for the Qing cause.
In 1659 Wu was put in charge of eliminating the remnants of Ming resistance in the southwest, and to this end he was given civil and military control of the southwestern province of Yunnan. With these powers he created an independent satrapy in Yunnan and neighbouring Guizhou province, collecting taxes and developing trade monopolies in the area. At the same time two other commanders set up similar satrapies in the neighbouring southern provinces of Guangdong and Fujian, and South China became an independent power that rivaled the Qing government in the north.
In 1673, when the Qing dynasty tried to check these southern kingdoms, Wu led them in a rebellion. In 1674 he advanced into central China but then hesitated, possibly because the Manchu were holding his son hostage. The Manchu then seized the initiative, but Wu still kept his force active. In March 1678 Wu set up his own dynasty, named Dazhou, in Hengzhou (now Hengyang), Hunan province, and proclaimed himself the emperor. Later that same year, Wu died of dysentery. His grandson continued the rebellion until 1681, when it was finally crushed. The incident is known in Chinese history as the Revolt of the Three Feudatories.
|
d3368b0d2c6fa941037cdd43388cdc79 | https://www.britannica.com/biography/Wu-Zhen | Wu Zhen | Wu Zhen
Wu Zhen, Wade-Giles romanization Wu Chen, (born 1280, Jiaxing, Zhejiang province, China—died 1354), one of the group of Chinese painters later known as the Four Masters of the Yuan, or Mongol, dynasty (1206–1368). His fame derives particularly from his incorruptible life as a recluse (and diviner) away from the Mongol court.
Wu, like others of the group, sought stylistic inspiration in the past (especially from Five Dynasties masters such as Juran), but his paintings are not overwhelmingly determined by that desire. Rather, they are a combination of the new attitudes of the Yuan period with sometimes conservative tendencies traceable to the Southern Song dynasty (1127–1279). Wu is generally associated with landscapes, especially scenes of fishermen.
|
38424faf5f366e36d9450b67da123263 | https://www.britannica.com/biography/Wu-Zheng | Wu Zheng | Wu Zheng
Liu Yin’s contemporary Wu Zheng (1249–1333) further developed the learning of the mind. He fully acknowledged the contribution of Lu Jiuyuan to the Confucian tradition, even though as an admirer of Xu Heng he considered himself a follower of Zhu Xi. Wu assigned himself the challenging task of…
|
6182d32c059956f856e5b86fb0bc6465 | https://www.britannica.com/biography/Wyclef-Jean | Wyclef Jean | Wyclef Jean
Wyclef Jean, byname of Nel Ust Wyclef Jean, original name Nel Ust Wycliffe Jean, (born October 17, 1969, Croix des Bouquets, Haiti), Haitian rapper, producer, and philanthropist whose dynamic, politically inflected rhymes and keen ear for hooks established him as a significant force in popular music.
Born in a suburb of Port-au-Prince, Jean was raised by relatives after his parents immigrated to the United States. At age nine he and his younger brother joined their parents in Brooklyn, New York. The family moved to Newark, New Jersey, when he was a teenager. Jean’s father, a Nazarene minister, prohibited rap music and encouraged Jean to channel his musical talents into the church choir. Nonetheless, Jean joined Tranzlator Crew (later known as the Fugees), a rap group founded by Prakazrel (“Pras”) Michel and Michel’s friend Lauryn Hill, in the late 1980s.
Jean studied music at Five Towns College in Dix Hills, New York, before dropping out to concentrate on his rapping. He continued to perform with Michel and Hill, and in 1994 they released their debut album, Blunted on Reality. Though the album was only moderately successful, the trio continued to record and in 1996 released their sophomore effort, The Score, as the Fugees. The recording, which innovatively blended elements of jazz, soul, reggae, and hip-hop, sold more than 18 million copies and won two Grammy Awards.
The members of the group then embarked upon solo efforts, a trajectory some observers attributed to an ill-fated affair between Jean and Hill. In 1997 Jean released Wyclef Jean Presents the Carnival Featuring Refugee All Stars, which mirrored the syncretic style of his efforts with the Fugees. Between albums, he collaborated with performers including Carlos Santana, for whom he produced the song “Maria, Maria,” and Whitney Houston, for whom he cowrote the hit “My Love Is Your Love.” In 2000 Jean followed up with The Ecleftic: 2 Sides II a Book and in 2002 released Masquerade. Further efforts included Sak Pasé Presents Welcome to Haiti: Creole 101 (2004), Carnival Vol. II: Memoirs of an Immigrant (2007), Toussaint St. Jean: From the Hut, to the Projects, to the Mansion (2009), Carnival III: The Fall and Rise of a Refugee (2017), and Wyclef Goes Back to School, Vol. 1 (2019).
In 1998 Jean founded the Wyclef Jean Foundation (later known as Yéle Haiti). The organization raised money and developed programs to assist victims of poverty in Haiti. Following the Haiti earthquake of 2010, Yéle Haiti raised several million dollars for those affected. In August 2010 Jean announced that he would run for president of Haiti, but he was deemed ineligible because he was not a resident of the country. Yéle Haiti was shut down in 2012 following investigations by the New York attorney general’s office and by media outlets that concluded that little of the money raised by the charity had actually gone to assist victims of the earthquake.
Jean chronicled his life in Purpose: An Immigrant’s Story (2012; with Anthony Bozza).
|
20c25d9670111de9db25779ebe2d395e | https://www.britannica.com/biography/Wyndham-Halswelle | Wyndham Halswelle | Wyndham Halswelle
…deliberately impeding the path of Wyndham Halswelle of Great Britain. A new race was ordered, but the other qualifiers, both American, refused to run. Halswelle then won the gold in the only walkover in Olympic history. (See also Sidebar: Dorando Pietri: Falling at the Finish.) Henry Taylor of Great Britain…
|
3de1b74cd4130fdd43c839aacfad0242 | https://www.britannica.com/biography/Wynkyn-de-Worde | Wynkyn de Worde | Wynkyn de Worde
Wynkyn de Worde, original name Jan Van Wynkyn, (died 1534/35), Alsatian-born printer in London, an astute businessman who published a large number of books (at least 600 titles from 1501). He was also the first printer in England to use italic type (1524).
He was employed at William Caxton’s press, Westminster (the first printing enterprise in England), from its foundation in 1476 until Caxton’s death in 1491, when he assumed control of the business. In 1500/01 he moved his press from Westminster to Fleet Street, London. Whereas Caxton and numerous continental European contemporaries were also editors and translators, Wynkyn was purely a commercial printer.
|
11271e5adb4240452ade9a6fae576c48 | https://www.britannica.com/biography/Xavier-Le-Pichon | Xavier Le Pichon | Xavier Le Pichon
…analysis by the French geophysicist Xavier Le Pichon proved that the plates did indeed form an integrated system where the sum of all crust generated at oceanic ridges is balanced by the cumulative amount destroyed in all subduction zones. That same year the American geophysicists Bryan Isacks, Jack Oliver, and…
|
e1c8ea9241d0d65b3e17fc4a22046b71 | https://www.britannica.com/biography/Xiong-Shili | Xiong Shili | Xiong Shili
Xiong Shili, Wade-Giles romanization Hsiung Shih-li, (born 1885?, Hubei province, China—died 1968, Beijing), one of the outstanding figures of 20th-century Chinese philosophy. His ontological system is an original synthesis of Buddhist, Confucian, and Western motifs.
Xiong was an anti-Manchu revolutionary in early youth, but after the age of 30 he devoted himself wholly to philosophy. He first studied metaphysical idealism in the Yogacara school of Mahayana Buddhism. He then turned to Confucian tradition, finding basic insights in the Yijing (“Book of Changes”) and in the idealistic branch of neo-Confucianism. From Western thought Xiong gained an appreciation of analytic method and the idea of evolutionary change. Accepting elements from all these sources, he shaped his own ontological system, which he presented in the eight-volume Xinweishilun (“New Doctrine of Consciousness Only”), published in 1944. From 1925 until his retirement, he was professor of philosophy at Peking University.
Briefly, the system is as follows. The cosmos is one great whole. Its basic nature, which is that of mind, will, and consciousness, is constant and continuous. It is dynamic, a vast ever-running current of changes, in a process of perpetual transformation, producing all things. In this current, two factors are at work: (1) an integrating and consolidating tendency called “closing,” which gives rise to all becoming of physical things, and (2) a strong, directing, controlling tendency called “opening,” which is the operation of mind. No matter how great the variety of physical and mental happenings in the world, they are all one in the “original substance.” They are its functionings and manifestations, just as waves of the ocean are one and continuous with the ocean itself. “Original substance” is “original mind.” Spirit and matter are simply two aspects of its perpetual operation. “Original mind” is thus common to human beings, heaven, earth, and all things. Its continuous transitions create what is new, not capriciously but with all the orderliness and causal sequence that science discovers. In spirit it is ren (humanity, humaneness, human-heartedness), the inmost ethical nature of reality and all its functions—a concept faithful to basic Confucian tradition.
|
cd32fc715ad9dc22fa45965c32fa31e6 | https://www.britannica.com/biography/Xu-Jingye | Xu Jingye | Xu Jingye
…of the ruling class under Xu Jingye raised a serious rebellion at Yangzhou in the south, but this was speedily put down. The empress instituted a reign of terror among the members of the Tang royal family and officials, employing armies of agents and informers. Fear overshadowed the life of…
|
30016b5495e09077b97f4de4b98e693a | https://www.britannica.com/biography/Yahya | Yaḥyā | Yaḥyā
Yaḥyā, in full Yaḥyā Maḥmūd al-Mutawakkil, (born 1867, Yemen—died Feb. 17, 1948, Sanaa, Yemen), Zaydī imam of Yemen from 1904 to 1948.
When Yaḥyā was a child, Yemen was a province of the Ottoman Empire. His youth was spent in the service of his father’s administration, and, when his father died in 1904, Yaḥyā succeeded him as imam. The Yemenis had always resented Turkish rule, and Yaḥyā was soon able to assemble a potent military force. Sporadic warfare lasted until 1911, when he was able to force the Turks to recognize the autonomy of his personal rule over the Yemen. He remained loyal to the Turks when World War I broke out but did not take an active part in the hostilities. At the close of the war he was recognized as the independent ruler of the Yemen, but there was no agreement on just which territories composed the country.
Yaḥyā clashed with the British, who had a military base in Aden and who considered many of the neighbouring tribes to be under their protection. He also clashed with his Arab neighbours along the Red Sea coast in the province of Asir. War with the Saudis broke out in 1934, just after the conclusion of the treaty with Great Britain, and Yaḥyā suffered a decisive defeat. King Ibn Saʿūd was generous: he demanded no territorial concessions from the imam and permitted a reversion to the prewar status quo. Thereafter foreign affairs ceased to be a dominant concern, and Yaḥyā directed his attention mostly to stabilization at home.
The hallmark of his rule was isolation from the outside world. His military power was based on the support of the Zaydī tribesmen of the interior highlands, while he administered the country through a small class of nobles known as sayyids. Yaḥyā himself secured what amounted to a monopoly of Yemen’s foreign trade. He was most concerned that no foreign influences disrupt this delicate equilibrium. He received some economic and military aid from the Italians in the 1920s and ’30s but firmly refused close contacts, such as an exchange of diplomatic missions. During World War II he remained neutral, but trouble began afterward, when the British strengthened their position in Aden and Yemenis who were discontented with Yaḥyā’s isolationist autocracy looked to them for support. Yemenis abroad also supported the domestic dissidents, but opposition did not become active until 1946. Two years later the aged imam was assassinated.
|
f157d6dc1a3d7c434f07a8a11edca069 | https://www.britannica.com/biography/Yahya-al-Qadir | Yaḥyā al-Qādir | Yaḥyā al-Qādir
But Yaḥyā al-Qādir (reigned 1075–92), al-Maʾmūn’s grandson, soon lost both Valencia and Córdoba. An alliance with Alfonso VI hastened the end of the Dhū an-Nūnid kingdom: while al-Qādir was briefly restored to Toledo, he bargained away his capital to the Christians in return for Valencia (1085),…
The weakness of al-Qādir, al-Maʾmun’s successor, permitted the Valencians to reassert their independence under the leadership of the Toledan governor, Abū Bakr, who allied himself with Alfonso VI of Leon and Castile. But when the latter took Toledo in 1085, he installed al-Qādir as puppet ruler in Valencia…
|
f622b5eb322063c83692cd8384851497 | https://www.britannica.com/biography/Yakubu-Gowon | Yakubu Gowon | Yakubu Gowon
Yakubu Gowon, also known as Jack Gowon, (born October 19, 1934, Pankshin, Nigeria), Nigerian military leader who served as head of state (1966–75).
Gowon came from Plateau state in the middle belt of Nigeria; his father was an early convert to Christianity. Gowon was educated in Zaria and later became a career army officer. He was trained in Ghana and in England at Sandhurst and twice served in the Congo region as part of Nigeria’s peacekeeping force there in the early 1960s. After the coup of January 1966, he was appointed chief of staff by Major General Johnson Aguiyi-Ironsi, the new leader. Northern officers staged a countercoup in July 1966, and Gowon emerged as the compromise head of the new government.
Gowon tried to resolve the ethnic tensions that threatened to fatally divide Nigeria. Although he was eventually successful in ending attacks against Igbo in the north, he was unable to effect a more lasting peace. In a final attempt to resolve the conflict, on May 27, 1967, Gowon declared a state of emergency and divided Nigeria’s four regions into 12 states. Three days later the Eastern region declared itself the independent state of Biafra with Odumegwu Ojukwu as its leader; armed conflict began in July.
Gowon directed government forces to remember that they were essentially fighting Nigerians, who were to be encouraged to rejoin the country. He also allowed a team of international observers to monitor the conduct of his troops. After the government victory in January 1970, a remarkable reconciliation took place between victors and vanquished, largely attributable to Gowon’s personal influence. By the mid-1970s Gowon was emerging as an international leader and was involved in the establishment of the Economic Community of West African States (ECOWAS). On July 29, 1975, however, while Gowon was in Uganda for an Organization of African Unity summit meeting, the army removed him from office.
Gowon was exiled to Great Britain. He was stripped of his rank for allegedly participating in the assassination of his successor, Murtala Mohammed, in 1976. He was pardoned by Shehu Shagari in 1981, and his rank was restored by Ibrahim Babangida in 1987. Having earned a Ph.D. at Warwick University in 1983, he became a professor of political science at the University of Jos in the mid-1980s and attained the status of an elder statesman of Nigerian politics.
|
1bd81abcc0b78910e8cc3ecc33011a3e | https://www.britannica.com/biography/Yamashita-Tomoyuki | Yamashita Tomoyuki | Yamashita Tomoyuki
Yamashita Tomoyuki, also called Yamashita Hōbun, byname Tiger of Malaya, (born Nov. 8, 1885, Kōchi, Japan—died Feb. 23, 1946, Manila, Phil.), Japanese general known for his successful attacks on Malaya and Singapore during World War II.
After graduating from the Army Academy (1905) and the Army War College (1916), Yamashita was an officer for the Army General Staff Office. He rose rapidly through the ranks of the Imperial Army, eventually becoming the highest-ranking general of its air force.
An able strategist, he trained Japanese soldiers in the technique of jungle warfare and helped conceive the military plan for the Japanese invasion of the Thai and Malay peninsulas in 1941–42. In the course of a 10-week campaign, Yamashita’s 25th Army overran all of Malaya and obtained the surrender of the huge British naval base at Singapore on Feb. 15, 1942. Soon afterward Yamashita was transferred by Prime Minister Tojo Hideki to an army training command in Manchuria, and he did not see active service again until after Tojo’s fall in 1944, when he was sent to command the defense of the Philippines. His forces were badly defeated in both the Leyte and the Luzon campaigns, but he held out until after the general surrender was announced from Tokyo in August 1945. Yamashita was tried for war crimes, and, though he denied knowing of atrocities committed under his command, he was convicted and eventually hanged.
|
140db4c8301102ae360d803cdafbf571 | https://www.britannica.com/biography/Yang-Zhu | Yang Zhu | Yang Zhu
Yang Zhu, Wade-Giles romanization Yang Chu, (born 440 bce, China—died 360? bce, China), Chinese philosopher traditionally associated with extreme egoism but better understood as an advocate of naturalism. He may also have been the first Chinese philosopher to discuss human nature (xing; literally “natural tendencies”).
When asked whether he would surrender merely one hair from his body in order to save humanity, Yang Zhu replied that “mankind is surely not to be helped by a single hair.” The Confucian philosopher Mencius (Mengzi; c. 371–289 bce), who promoted a conception of society and government based on family ties, condemned Yang’s doctrines of keeping one’s nature intact and protecting one’s body as an example of radical individualism that subverted the natural order of human relationships. Confucian tradition, the state orthodoxy from the Han dynasty (206 bce–220 ce) through the Qing dynasty (1644–1911/12), sustained Mencius’s critique.
Yang Zhu’s naturalism is evident in his belief in giving life “its free course” while “neither checking nor obstructing it.” Yang felt that human beings should live pleasurably, which for him implied a life in which both selfish inaction and selfless intervention in human affairs would be contrary extremes; instead, one should lead a natural life by cultivating and following one’s innate natural tendencies. Yang’s purported refusal to save the world by sacrificing one hair did not promote the principle of “everyone for himself,” as Mencius believed. Rather, Yang asserted that intentional social actions, regardless of motivation, disrupt and divert the natural course of one’s life and result in more harm than good.
Little is known about him beyond the information provided in several sources that mention his teachings, most notably the seventh chapter of the Daoist work Liezi, which is attributed to a philosopher of that name (flourished 4th century bce) but dates in its current form to about the 4th century ce. His thought was also an apparent influence on some of the later chapters of the philosophical and literary classic the Zhuangzi, which is attributed to a Daoist sage of that name (flourished 4th century bce).
|
36828a1102eedc3cac0ae7f0a04b64cd | https://www.britannica.com/biography/Yanka-Kupala | Yanka Kupala | Yanka Kupala
…Harun, Vladimir Zylka, Kazimir Svayak, Yanka Kupala, and Yakub Kolas and the prose writers Zmitrok Byadulya and Maksim Haretski. Many of these writers had been contributors to the influential Belarusian newspaper Nasha Niva (“Our Field”), published in Vilnius during the period 1906–16. Of crucial importance for an understanding of the…
|
c8f7ff599d8d8b9b6f72f98b3e1cb7df | https://www.britannica.com/biography/Yannick-Nezet-Seguin | Yannick Nézet-Séguin | Yannick Nézet-Séguin
Yannick Nézet-Séguin, (born March 6, 1975, Montreal, Quebec, Canada), Canadian conductor and pianist who was music director of the Philadelphia Orchestra (2012– ), which he was credited with revitalizing through a dynamic mixture of “music-making and diplomacy,” and of the Metropolitan Opera (2018– ) in New York City.
As a young man, Nézet-Séguin studied piano, conducting, composition, and chamber music at the Quebec Conservatory in Montreal and choral conducting at the Westminster Choir College in Princeton, New Jersey. Later he thrived under the guidance of several well-known conductors, most notably Italian maestro Carlo Maria Giulini. Nézet-Séguin became known for his remarkable musicianship, dedication, and enthusiasm.
In 2012 he was appointed music director of the Philadelphia Orchestra. At the time, the ensemble was emerging from bankruptcy, and Nézet-Séguin sought to “reconnect the orchestra to its community.” To this end the orchestra, in addition to performing its scheduled season concerts at Philadelphia’s Kimmel Center for the Performing Arts, appeared at impromptu shows around the city—in shipyards, churches, and schools.
Elsewhere Nézet-Séguin served as artistic director and principal conductor (2000– ) of the Orchestre Métropolitain (Montreal) and as music director (2008–18) of the Rotterdam (Netherlands) Philharmonic Orchestra; he assumed the role of honorary conductor of the Rotterdam Philharmonic in 2018. In addition, he worked with many other fine ensembles in Europe, including the Dresden Staatskapelle, the Berlin Philharmonic Orchestra, the Berliner Staatskapelle, the Bavarian Radio Symphony Orchestra, the Vienna Philharmonic Orchestra, and the Chamber Orchestra of Europe.
Nézet-Séguin was also a notable opera conductor. He had regular engagements at New York City’s Metropolitan Opera, and he also appeared at the Salzburg Festival, La Scala (Milan), the Royal Opera House (Covent Garden, London), and the Dutch National Opera (Amsterdam). In 2016 he was appointed to succeed James Levine as music director of the Metropolitan Opera. However, because the conductor’s schedule was so busy and because his contract with the Philadelphia Orchestra was still in effect, he could not commit to the new position until the 2020–21 season. Beginning in 2017, he maintained the two posts in tandem and assumed the title of music director designate of the Metropolitan Opera, conducting two operas there each season. Nézet-Séguin later adjusted his schedule and took up the role of music director for the 2018–19 season, two years earlier than planned.
At a time when few conductors had personal recording contracts, Nézet-Séguin enjoyed an open-ended agreement with Deutsche Grammophon. His extensive discography included numerous recordings for the German label, including the 2015 Rachmaninov Variations with Daniil Trifonov and the Philadelphia Orchestra.
|
9178215c0155fa81ab7a25cf2dc56703 | https://www.britannica.com/biography/Yaqub-Khan | Yaʿqūb Khan | Yaʿqūb Khan
The Treaty of Gandamak (Gandomak; May 26, 1879) recognized Shīr ʿAlī’s son, Yaʿqūb Khan, as emir, and he subsequently agreed to receive a permanent British embassy at Kabul. In addition, he agreed to conduct his foreign relations with other states in accordance “with the wishes and advice” of the British government.
Yaʿqūb Khan promised, in exchange for British support and protection, to admit to his Kabul court a British resident who would direct Afghan foreign relations, but the resident, Sir Louis Cavagnari, was assassinated on September 3, 1879, just two months after he arrived.
|
fcbe2bebac5111c8c34d03ff7fe1b2db | https://www.britannica.com/biography/Yasser-Arafat/From-agreement-to-the-second-intifadah | From agreement to the second intifāḍah | From agreement to the second intifāḍah
On October 30, 1991, following the Persian Gulf War, the Madrid Conference—a peace conference including Arab countries, Palestinians, and Israel—opened under the joint presidency of the United States and the Soviet Union. There the Palestinians were represented not through the PLO—which the Israeli government refused to deal with—but through a joint Jordanian-Palestinian delegation led by Palestinians from the occupied territories. Although the Madrid talks themselves failed to achieve a substantive agreement, they were valuable in paving the way for additional negotiations. Among these was a secret channel of negotiations in Oslo, held beginning in January 1993 between PLO and Israeli officials, which produced an understanding known as the Oslo Accords. In September 1993 Arafat and Israeli Prime Minister Yitzhak Rabin exchanged letters in which Arafat, as head of the PLO, formally recognized “the right of the State of Israel to exist in peace and security” while Rabin recognized the PLO as the “representative of the Palestinian people” and made clear Israel’s intention to begin negotiations with the organization. On September 13, 1993, Arafat, Rabin, and U.S. Pres. Bill Clinton signed the Declaration of Principles on Palestinian Self-Rule in Washington, D.C. The Israeli-PLO accord, also known as Oslo I, envisioned the gradual implementation of Palestinian self-government in the West Bank and Gaza Strip for a transitional period not exceeding five years and leading to a permanent settlement based on UN Security Council Resolutions 242 and 338. The following year Arafat returned to the Gaza Strip and began implementing Palestinian self-rule.
The provisions of the Declaration of Principles were enacted on May 4, 1994, by a pact signed by Arafat and Rabin in Cairo. The following year, in September 1995, Rabin, Arafat, and Israeli foreign minister Shimon Peres—all recently named winners of the Nobel Peace Prize—signed the Interim Agreement on the West Bank and Gaza Strip (often called Oslo II). The agreement established a schedule for Israeli withdrawals from the Palestinian population centres (to be implemented in several stages) and created a complex system of zones that were divided between areas fully controlled by the Palestinians, those under Palestinian civil authority but Israeli military control, and those exclusively under Israeli control. It also set elections for a president and council of the Palestinian Authority, which would govern the Palestinian population in the occupied territories, and on January 20, 1996, Arafat was elected president of the PA. With a turnout of close to 80 percent, Arafat won 88 percent of the vote.
Relations with Rabin had remained respectful, even if they were sometimes difficult—especially on the sensitive subject of Israel’s ongoing settlement activity. But with Rabin’s assassination by a Jewish extremist in November 1995 and the election in May 1996 of Benjamin Netanyahu—leader of the Likud, a right-wing political party, and an opponent of the Oslo Accords—as prime minister, relations grew strained. Negotiations became deadlocked, even after an intervention by Clinton, who arranged a summit meeting with the two leaders at the Wye Plantation in eastern Maryland in 1998. Negotiations were revived after the election of Israel Labour Party leader Ehud Barak as prime minister in 1999, but in a very tense context. The unabated continuation of settlement activity—some 100,000 more settlers arrived in the West Bank between 1993 and 2000 (without taking Jerusalem into account)—created great discontent among the Palestinians and strengthened the Ḥamās opposition to the Oslo Accords. For his part, Arafat proved unable to create the structures of an independent state (for reasons linked with his own shortcomings and with the fact that most of the West Bank and Gaza Strip were still occupied).
In July 2000 Clinton convened a summit at Camp David in northern Maryland, where the historic Camp David Accords between Israel and Egypt had been negotiated in 1978. The aim was to find a final agreement to the Israeli-Palestinian conflict after five years of Palestinian self-rule. The summit was hastily prepared, however, and, since the most contentious issues—the question of the right of return for Palestinian refugees, control of Jerusalem, borders, and Jewish settlements in the West Bank and Gaza Strip—were being discussed for the first time, it was unlikely that these sensitive and complex matters would be resolved quickly. From the beginning, Arafat was suspicious of the summit and its timing, and although some progress was made, in the end there was no final settlement.
Negotiations continued after the failure at Camp David, but a visit by Likud leader Ariel Sharon to the Temple Mount in Jerusalem in September 2000 sparked the second intifāḍah, and the dwindling talks ground to a halt. A spiral of harsh repression by the Israeli army and violence by different armed Palestinian groups subsequently led to both sides’ total loss of confidence in the peace process. In spite of the January 2001 negotiations at Ṭābā, Egypt, which were held independently of the United States and made important progress, the Barak government lost the February 2001 general elections and Sharon—a strong opponent of both the Oslo Accords and the creation of a Palestinian state—was elected prime minister. “We have no partner for peace” was once more the general sentiment of many Israeli political parties.
Arafat lost much of his diplomatic credibility with the West after the election of U.S. Pres. George W. Bush in November 2000 and the launch of the “war on terror” in 2001, which followed the September 11 attacks on the World Trade Center in New York City and the Pentagon in Washington, D.C. In 2001, following suicide attacks in Israel that Sharon blamed Arafat for instigating, Arafat was confined by Israel to his headquarters in Ramallah. In October 2004 Arafat fell ill and was transported to Paris for medical treatment, where he died the following month. Fatah later passed a unanimous resolution that held Israel responsible for Arafat’s death.
Many of Arafat’s supporters doubted that he had died a natural death, their suspicions being fueled in part by the doctors’ inability to identify the origin of his illness and the lack of an autopsy, and rumours circulated that he had died from poisoning. These suspicions surfaced again in July 2012 when a Swiss laboratory announced that it had discovered elevated levels of polonium-210 on some of Arafat’s clothes and personal belongings. French prosecutors launched a murder investigation later that year in response to a request by Arafat’s widow. In November 2012 Arafat’s remains were exhumed so that teams of Swiss, Russian, and French experts could test for signs of poisoning.
The results of the separate investigations, released in late 2013, were contradictory. The Russian report was released first and found no traces of polonium-210. The Swiss and French results both found abnormally high levels of polonium-210 but disagreed on how it got into his remains: the Swiss study concluded that poisoning could not be ruled out, while the French study concluded that the presence of polonium-210 could be explained as environmental in origin.
Moreover, none of the reports escaped controversy. There were claims of outright interference in the Russian investigation, including claims that the Russian scientists were instructed on what the outcome should be. The Swiss report cited a number of intervening factors that introduced uncertainty into the interpretation of the results, including the length of time since the death and the incomplete “chain of custody” of Arafat’s clothes and belongings studied in the earlier investigation. Palestinian officials and Arafat’s widow dismissed the French conclusions as “politicized.”
An assessment of the personality of Yasser Arafat must take into consideration both his deep religiosity and his fierce nationalism (even if he tended to equate Palestinian nationalism with himself). He often said that he was married to the Palestinian cause, and indeed he had no other bride—at least until he married Suhā al-Ṭawīl, a Sorbonne-educated Palestinian woman of Christian origin, in 1990. He customarily worked late into the night, sometimes receiving leaders and journalists well after midnight. He lived in modest fashion—even as he provided supporters with money and costly favours, purchased influence, and accepted the corruption of many of those around him—and, in spite of criticism of his authoritarian style of governing, he managed to gain a wide popularity among his people. His opponents—both in Israel and in the Arab world—were numerous, however, and Arafat escaped so many assassination attempts through the years that his intuition and resilience became a sort of legend.
To assess Arafat’s life as a whole is no easy task. He succeeded in putting the Palestinians back on the political map after their disastrous uprooting in the middle of the 20th century. He was also able to maintain the unity of a cohesive Palestinian organization in spite of interference from neighbouring Arab states. But Arafat’s shortcomings in building solid state institutions after 1993 were matched by his shortcomings in understanding the Israeli public and its fears. At the end of his life he had reached a state of complete diplomatic isolation—and yet, as Ḥamās and Fatah continued to vie for influence in the occupied territories in the years after his death, it looked as though history might find that he was the last Palestinian leader able to sign a peace agreement and impose it on the Palestinian community as a whole.
|
010bc7fd1f0153c2b1a0191e5820aa36 | https://www.britannica.com/biography/Yayoi-Kusama | Yayoi Kusama | Yayoi Kusama
Yayoi Kusama, (born March 22, 1929, Matsumoto, Japan), Japanese artist who was a self-described “obsessional artist,” known for her extensive use of polka dots and for her infinity installations. She employed painting, sculpture, performance art, and installations in a variety of styles, including Pop art and Minimalism.
Notable works include Obliteration Room (2002–present) and Infinity Mirror Room—Phalli’s Field (1965/2016), the first of many distinct iterations. Kusama was born the youngest daughter of an affluent family. She indicated that her mother was physically and verbally abusive, while her father was a womanizer. Although she had relationships with fellow artists, she never married or had children.
By her own account, Kusama began painting as a child, at about the time she began experiencing hallucinations that often involved fields of dots. Those hallucinations and the theme of dots would continue to inform her art throughout her career. She had little formal training, studying art only briefly (1948–49) at the Kyōto City Specialist School of Arts. Family conflict and the desire to become an artist drove her to move in 1957 to the United States, where she settled in New York City. Before leaving Japan, she destroyed many of her early paintings.
Her early work in New York City included what she called “infinity net” paintings. Those consisted of thousands of tiny marks obsessively repeated across large canvases without regard for the edges of the canvas, as if they continued into infinity. Such works explored the physical and psychological boundaries of painting, with the seemingly endless repetition of the marks creating an almost hypnotic sensation for both the viewer and the artist. Her paintings from that period anticipated the emerging Minimalist movement, but her work soon transitioned to Pop art and performance art. She became a central figure in the New York avant-garde, and her work was exhibited alongside that of such artists as Donald Judd, Claes Oldenburg, and Andy Warhol.
Obsessive repetition continued to be a theme in Kusama’s sculpture and installation art, which she began to exhibit in the early 1960s. The theme of sexual anxiety linked much of that work, in which Kusama covered the surface of objects, such as an armchair in Accumulation No. 1 (1962), with small soft phallic sculptures constructed from white fabric. Installations from that time included Infinity Mirror Room—Phalli’s Field (1965), a mirrored room whose floors were covered with hundreds of stuffed phalli that had been painted with red dots. Mirrors gave her the opportunity to create infinite planes in her installations, and she would continue to use them in later pieces.
Mirroring the times, Kusama’s performance art explored antiwar, antiestablishment, and free-love ideas. Those Happenings often involved public nudity, with the stated intention of disassembling boundaries of identity, sexuality, and the body. In Grand Orgy to Awaken the Dead (1969), Kusama painted dots on participants’ naked bodies in an unauthorized performance in the fountain of the sculpture garden of New York’s Museum of Modern Art. Critics accused her of intense self-promotion, and her work was regularly covered in the press; Grand Orgy appeared on the front page of the New York Daily News.
Kusama moved back to Japan in 1973. From 1977, by her own choice, she lived in a mental hospital. She continued to produce art during that period and also wrote surreal poetry and fiction, including The Hustlers Grotto of Christopher Street (1984) and Between Heaven and Earth (1988).
Kusama returned to the international art world in 1989 with shows in New York City and Oxford, England. In 1993 she represented Japan at the Venice Biennale with work that included Mirror Room (Pumpkin), an installation in which she filled a mirrored room with pumpkin sculptures covered in her signature dots. Between 1998 and 1999 a major retrospective of her works was shown at the Los Angeles County Museum of Art, the Museum of Modern Art in New York City, the Walker Art Center in Minneapolis, Minnesota, and Tokyo’s Museum of Contemporary Art. In 2006 she received the Japan Art Association’s Praemium Imperiale prize for painting. Her work was the subject of a major retrospective at the Whitney Museum of American Art in New York City in 2012, and a traveling exhibition attracted record crowds at the Hirshhorn Museum and Sculpture Garden in Washington, D.C., in 2017. The latter show featured a sample of Kusama’s Infinity Mirrored Rooms, installations usually comprising a mirrored room with hundreds of coloured lights, and the works soon became some of her most popular pieces. That year she opened a museum dedicated to her work in Tokyo, near her studio and the psychiatric hospital where she lived.
|
05eba8660948f19a22b04414a0a23436 | https://www.britannica.com/biography/Ye-Jianying | Ye Jianying | Ye Jianying
Ye Jianying, Wade-Giles romanization Yeh Chien-ying, original name Ye Yiwei, (born April 28, 1897, Meixian, Guangdong province, China—died Oct. 22, 1986, Beijing), Chinese communist military officer, administrator, and statesman who held high posts in the Chinese government during the 1970s and ’80s.
Born of a middle-class family, Ye graduated from the Yunnan Military Academy in 1919 and joined Sun Yat-sen’s Nationalist movement shortly thereafter. He established a lifelong friendship with Zhou Enlai when the two were on the faculty of the Whampoa (Huangpu) Military Academy during the mid-1920s. He joined the Chinese Communist Party (CCP) in 1927 and studied in Moscow from early 1929 to late 1930, subsequently joining Mao Zedong’s Jiangxi Soviet. Ye helped plan the Long March (1934–35), and by the late 1930s he had earned a reputation as an outstanding strategic planner. He was chief of staff of the (communist) Eighth Route Army during much of World War II and became a member of the Central Committee of the CCP in 1945. During the civil war between the communists and Nationalists (1945–49), he was deputy chief of the general staff of the communist armed forces.
Ye was the chief political commissar in Guangdong province in the early 1950s and was also mayor of Guangzhou (Canton) at this time. In 1955 he was made a marshal of the People’s Liberation Army, and in 1966 he was made a member of the ruling Political Bureau (Politburo) of the CCP. He became a member of the powerful Standing Committee of the Political Bureau in 1973. After Mao’s death in 1976, Ye opposed the Gang of Four and supported Hua Guofeng. Ye served as defense minister from 1975 to 1978 but, having grown feeble from old age, was in the latter year made chairman of the Standing Committee of the National People’s Congress, thereby becoming nominal chief of state. He generally opposed the reforms of Deng Xiaoping, and in 1985 he retired from his principal posts, including his membership in the Political Bureau.
|
6c4385ce09664eb8e35cec8ac53156fc | https://www.britannica.com/biography/Yehuda-Amichai | Yehuda Amichai | Yehuda Amichai
Yehuda Amichai, (born May 3, 1924, Würzburg, Germany—died September 22, 2000, Jerusalem, Israel), Israeli writer who is best known for his poetry.
Amichai and his Orthodox Jewish family immigrated to Palestine in 1936. During World War II he served in the British army, but he later fought the British as a guerrilla prior to the formation of Israel; he also was involved in the Arab-Israeli conflicts of 1956 and 1973. Amichai attended the Hebrew University of Jerusalem and taught for several years at secondary schools.
Amichai’s poetry reflects his total commitment to the state of Israel, and from his first collection, Akhshav u-ve-yamim aḥerim (1955; “Now and in Other Days”), he employed biblical images and Jewish history. He also compared modern times with ancient, heroic ages and sought to expand biblical language in order to encompass contemporary phenomena. In the 1970s he introduced sexuality as a subject in his poems. With Amen (1977) he garnered a wider audience through the translation of his poems into English by Ted Hughes. Influenced by modern American and English poets, including W.H. Auden, Amichai was noted for his lyrical use of everyday language and the simplicity of his work. The English-language collection The Selected Poetry of Yehuda Amichai (1986) contains selections from his many publications in Hebrew.
In addition to short stories and plays, Amichai also wrote novels, of which the best known is Lo me-achshav, lo mi-kan (1963; Not of This Time, Not of This Place), about the quest for identity of a Jewish immigrant to Israel. Gam ha-ʾegrof hayah paʿam yad petuḥah (1989; Even a Fist Was Once an Open Palm with Fingers) is a selection of his poetry in translation. Open Closed Open (2000) continued to explore the Israeli experience.
|
3e67460edd99be3df7ffee829f42a695 | https://www.britannica.com/biography/Yelena-Glinskaya | Yelena Glinskaya | Yelena Glinskaya
…her apparent barrenness, he married Yelena Glinskaya, who bore him only two children—the deaf and mute Yury and the sickly Ivan, who was three years old at Vasily’s death in 1533.
|
4ad31c4053b9ac075e9b2f63dfaf5473 | https://www.britannica.com/biography/Yemelyan-Pugachev | Yemelyan Pugachev | Yemelyan Pugachev
Yemelyan Pugachev, in full Yemelyan Ivanovich Pugachev, Pugachev also spelled Pugachov, (born c. 1742, Zimoveyskaya-na-Donu, Russia—died January 21 [January 10, Old Style], 1775, Moscow), leader of a major Cossack and peasant rebellion in Russia (Pugachev Rebellion, 1773–75).
An illiterate Don Cossack, Pugachev fought in the Russian army in the final battles of the Seven Years’ War (1756–63), in Russia’s campaign in Poland (1764), and in the Russo-Turkish War of 1768–74. Following the siege and conquest of Bendery (1769–70), however, he returned home as an invalid. For three years after his recovery, he wandered, particularly among settlements of Old Believers, a dissident religious group that exercised considerable influence over him.
Learning in the course of his travels of the Yaik (Ural) Cossack Rebellion of 1772 and of its cruel suppression, Pugachev proceeded to Yaitsky Gorodok (now Oral), where the Cossacks remained discontented. Although he was arrested there for desertion from the army, imprisoned at Kazan, and sentenced to be deported to Siberia, he escaped and in June 1773 appeared in the steppes east of the Volga River. Claiming to be Emperor Peter III (who had been deposed by his wife, Catherine the Great, and assassinated in 1762), Pugachev decreed the abolition of serfdom and gathered a substantial following, including Yaik Cossacks, peasant workers in the mines and factories of the Urals, agricultural peasants, clergymen, and the Bashkirs. Planning ultimately to depose Catherine, Pugachev stormed and laid siege to Orenburg, an important commercial and industrial centre of the Ural region (fall 1773).
As the landowners of the region, fearing for their lives, fled to Moscow, Catherine recognized the seriousness of the rebellion and sent an army commanded by Gen. A.I. Bibikov against Pugachev (January 1774). In the spring Bibikov defeated Pugachev at Tatishchevo, west of Orenburg, but Pugachev proceeded to Kazan and burned the city (July 1774). He was defeated again several days later, but he crossed the Volga River, intending to gather reinforcements among the Don Cossacks. He captured Saratov (August 1774) and besieged Tsaritsyn (now Volgograd), where Gen. A.V. Suvorov finally defeated him (September 3 [August 23, Old Style], 1774). Pugachev escaped but was betrayed by some Yaik Cossacks, sent to Moscow, and executed.
|
4e04816b6ea95220eb53f6cdded74ced | https://www.britannica.com/biography/Yemi-Bisiri | Yemi Bisiri | Yemi Bisiri
Yemi Bisiri made lost-wax brass figures for the Ogboni cult, but in a contemporary style. Jinadu Oladepo created brass figures and bracelets and pendants that were worn by the Oshogbo artists as a kind of insignia. Senabu Oloyede and Kikelomo Oladepo both worked in cloth…
|
9bed2c6f95bc2195a765db81d14bf007 | https://www.britannica.com/biography/Yermak-Timofeyevich | Yermak Timofeyevich | Yermak Timofeyevich
Yermak Timofeyevich, also spelled Ermak Timofeevich, (died Aug. 6, 1584/85, Siberia), Cossack leader of an expeditionary force during Russia’s initial attempts to annex western Siberia. He became a hero of Russian folklore.
In 1579 the merchant and factory-owning Stroganov family enlisted the assistance of Yermak and a band of Cossacks to force Siberian tribesmen to cooperate with the Russians’ plan to extract natural resources from Siberia. (The tsar had granted the family a charter to the land in 1558.) Yermak set out with an expeditionary force of 840 men on Sept. 1, 1581, and in the spring of 1582 reached the central regions of the Tatar khanate of Sibir, whose head, Kuchum, ruled over the local tribes. Because his men had firearms, Yermak was able to defeat the numerically superior forces of Khan Kuchum and occupy the capital, Kashlyk (or Sibir), on the Irtysh River.
Although the tsar sent Yermak another 500 men, resistance flared on all sides. In August 1584 (or 1585) Kuchum attacked and destroyed a small party of Cossacks led by Yermak, who, fighting his way to the boats, was drowned in the Irtysh, apparently by the weight of the coat of chain mail sent to him by the tsar.
|
711fa7cb2418ba9532948bf198b11032 | https://www.britannica.com/biography/Yevgeny-Abramovich-Baratynsky | Yevgeny Abramovich Baratynsky | Yevgeny Abramovich Baratynsky
Yevgeny Abramovich Baratynsky, Baratynsky also spelled Boratynsky, (born February 19 [March 2, New Style], 1800, Mara, Russia—died June 29 [July 11], 1844, Naples, Kingdom of Naples [Italy]), foremost Russian philosophical poet contemporary with Aleksandr Pushkin. In his poetry he combined an elegant, precise style with spiritual melancholy in dealing with abstract idealistic concepts.
Of noble parentage, Baratynsky was expelled from the imperial corps of pages, entered the army, was commissioned, and retired in 1826. He married and settled at Muranovo, near Moscow. His early romantic lyrics are strongly personal, dreamy, and disenchanted. His narrative poems Eda (1826), Bal (1828; “The Ball”), and Nalozhnitsa (1831; “The Concubine”; rewritten as Tsyganka, “The Gypsy Girl,” 1842) treat the emotions analytically. Tsyganka was attacked by critics of the time as “base” and “coarse.” The poem Na smert Gyote (1832; “On the Death of Goethe”) is one of his masterpieces. Tragic pessimism dominates his later poetry, which is mainly on philosophical and aesthetic themes. Modern critics value his thought more highly than did his contemporaries.
|
5513669e493dcdef805cd6247659bcee | https://www.britannica.com/biography/Yevgeny-Lifshitz | Yevgeny Lifshitz | Yevgeny Lifshitz
In 1956 Russian physicist Yevgeny Lifshitz applied Casimir’s work to materials with different dielectric properties and found that in some cases the Casimir effect could be repulsive. In 2008 American physicist Jeremy Munday and Italian American physicist Federico Capasso first observed the repulsive Casimir effect between a gold-plated polystyrene
|
c2c1c55ddac4800b1a3df02ba67257a2 | https://www.britannica.com/biography/Yevno-Azef | Yevno Azef | Yevno Azef
…when it was disclosed that Yevno Azef, longtime head of the terrorist wing of the Socialist Revolutionary Party, was also an employee of the department of police and had for years been both betraying his revolutionary colleagues and organizing the murders of his official superiors.
|
2c9f30196f08360eb68d493f089d57f1 | https://www.britannica.com/biography/Yi-Sun-shin | Yi Sun-shin | Yi Sun-shin
Yi Sun-shin, also spelled Yi Sun-sin, (born April 28, 1545, Seoul, Korea [now in South Korea]—died Dec. 16, 1598, off Noryang), Korean admiral and national hero whose naval victories were instrumental in repelling Japanese invasions of Korea in the 1590s.
After passing the government examinations to become a military officer in 1576, Yi served at various army and navy posts. Although he was twice discharged after being falsely accused by jealous colleagues, in 1591 he was appointed commander of the naval forces in Left Chŏlla province, where he concentrated on training his men, stocking equipment and supplies, and developing the renowned kŏbuksŏn (“turtle ship”). The kŏbuksŏn is thought to have been the first ironclad battleship in history. Its upper deck was covered with armoured plates to protect its crew, and spikes and knives were attached to the plates to discourage enemies from boarding. The ship’s bow was equipped with a dragon head through which cannon could be fired and clouds of smoke could be emitted to obscure the ship’s position. Cannon and guns could also be fired from the stern and the sides of the ship.
As a result of Yi’s preparations, his forces, unlike most of the Korean military, were ready to fight when the Japanese invaded in 1592. Yi’s victories off the southern coast effectively cut off the Japanese troops in Korea from supplies and reinforcements and prevented the Japanese from pressing their initial advantage. In 1593 Yi was given command of the entire Korean fleet, but, following peace negotiations, in 1597 he was again falsely accused of disloyalty and demoted to the rank of common soldier. The Japanese then launched a second invasion and succeeded in destroying almost all of the Korean navy. Yi was reinstated as commander of the few remaining ships and, continuing his undefeated battle record, soon restored Korea’s control of the seas. He was killed by a stray bullet as he pursued the retreating Japanese forces during the final campaign of the war.
|
261c62e6414a4a04459dcc575c745bec | https://www.britannica.com/biography/Yigael-Yadin | Yigael Yadin | Yigael Yadin
Yigael Yadin, original name Yigael Sukenik, (born March 21, 1917, Jerusalem—died June 28, 1984, H̱adera, Israel), Israeli archaeologist and military leader noted for his work on the Dead Sea Scrolls.
Yadin, the son of an archaeologist, was educated at Hebrew University (M.A., 1945; Ph.D., 1955). He was a member of the Haganah military organization from 1932 to 1948 and served as chief of the general staff of the Israel Defense Forces from 1949 to 1952. He was also deputy prime minister, 1977–81. Yadin, who was a leader of major archaeological expeditions in Israel, including those at Haẓor (1955–58; 1968), the Dead Sea Caves (1960–61), and Masada (1963–65), became professor of archaeology at Hebrew University in 1959. He was awarded the Israel Prize (1956) and the Rothschild humanities prize (1964).
Yadin’s writings centre upon his archaeological endeavours. They include The Message of the Scrolls (1957; new ed. 1962), Hazor, 3 vol. (1958–62), and The Art of Warfare in Biblical Lands in the Light of Archaeological Discovery, 2 vol. (1963). He is also the author of Masada: Herod’s Fortress and the Zealots’ Last Stand (1966).
|
a4aabd0a8828cb6806d1a364001561f3 | https://www.britannica.com/biography/Yinka-Shonibare | Yinka Shonibare | Yinka Shonibare
Yinka Shonibare, (born February 10, 1962, London, England), British artist of Nigerian heritage, known for his examination of such ideas as authenticity, identity, colonialism, and power relations in often-ironic drawings, paintings, sculptures, photographs, films, and installations. A signature element of his work is his use of so-called Dutch wax-printed fabric, produced by means of a batiklike technique. Exported from the Netherlands and elsewhere in Europe in the late 19th century, the brightly coloured patterned fabric was meant to imitate Indonesian cloth and was enthusiastically adopted in West Africa, so this inauthentic Indonesian textile produced in Europe became known as “African” cloth.
Shonibare was born to wealthy Nigerian parents living in London. When he was about three years old, his family returned to Nigeria, and he grew up in Lagos (then the capital of Nigeria) while continuing to summer in England. Although his parents were disappointed with his chosen career, he was allowed to return to England to attend art school. Just weeks after his classes began, Shonibare came down with transverse myelitis, a disorder caused by inflammation of the spinal cord. After being hospitalized for a year, he entered Byam Shaw School of Art (B.A., 1984–89; now part of Central Saint Martin’s College of Art and Design). He received an M.F.A. degree from Goldsmiths’ College (1991; now Goldsmiths, University of London).
Shonibare’s art was placed on its trajectory by the comments of one of his teachers, who asked him why he didn’t make “authentic African art.” As someone who had spoken Yoruba at home yet watched British and U.S. television, was perfectly fluent in English, and had lived in both England and urban Nigeria, the artist pondered the meaning of authenticity and the greater significance of his multicultural identity. Although Shonibare’s work was included in the 1997 traveling exhibition “Sensation: Young British Artists from the Saatchi Collection” and he was a contemporary of members of the so-called YBAs (Young British Artists), he considered his concerns to be quite different from theirs.
In part because of his experimentation with so many media, Shonibare’s art defies easy categorization. In paintings such as Double Dutch (1994), he created a large work by painting a rectangle on a wall and placing on it a grid of several small stretchers covered with the Dutch wax-printed fabric ubiquitous in his art. He then began using these textiles to create costumes in the Victorian style for mannequins. These brightly clothed mannequins sometimes were headless (Scramble for Africa, 2003) and sometimes had objects such as globes in place of human heads (Planets in My Head, Philosophy, 2011). In such works as Diary of a Victorian Dandy (1998; based on the narrative works of British artist William Hogarth), Shonibare created a series of photographs featuring himself as a dandy in a variety of tableaux. He also portrayed the protagonist of an Oscar Wilde novel in the photographic series Dorian Gray (2001). Many of Shonibare’s works made reference to paintings by earlier artists, among them Jean-Honoré Fragonard, Francisco de Goya, and Leonardo da Vinci. In the 21st century, Shonibare expanded his repertoire of techniques to include films (Un ballo in maschera, 2004, and Odile and Odette, 2005).
In 2004 Shonibare was nominated for the Turner Prize, and in 2005—somewhat ironically, considering his exploration of colonialism and empire—he was appointed MBE (Member of the Order of the British Empire); thereafter he presented himself professionally as “Yinka Shonibare MBE.” In 2010 his Nelson’s Ship in a Bottle won the commission to occupy Trafalgar Square’s Fourth Plinth, a project that reflected his growing interest in public art.
|
58cf0975e0b2abb80a1ba0e30f8441eb | https://www.britannica.com/biography/Yinreng | Yinreng | Yinreng
He nominated the second son, Yinreng, crown prince in 1675, at the age of little more than a year and a half; this was against the Manchu tradition of giving all sons equal rights of succession, and it resulted in vicious fights among Kangxi’s sons. The hapless Yinreng was deposed…
…emperor’s designated heir, his son Yinreng, was a bitter disappointment, and the succession struggle that followed the latter’s demotion was perhaps the bloodiest in Qing history. Many Chinese historians still question whether the Kangxi emperor’s eventual successor, his son Yinzhen (reign title Yongzheng), was truly the emperor’s deathbed choice. During…
|
685a39c765c0fd4139f939e0287abe63 | https://www.britannica.com/biography/Yitzhak-Zuckerman | Yitzhak Zuckerman | Yitzhak Zuckerman
Yitzhak Zuckerman, also spelled Itzhak Zuckerman or Yitzhak Cukierman, byname Antek, (born 1915, Warsaw, Pol.—died June 17, 1981, Tel Aviv, Isr.), hero of Jewish resistance to the Nazis in World War II and one of the few survivors of the Warsaw Ghetto Uprising.
Zuckerman was active in a federation of young Zionist organizations, Hehalutz, and early favoured armed resistance to Nazi depredations against the Jews. He was quick to interpret the first mass executions of Jews as the beginning of a systematic program of annihilation. Perceiving the full scope of Nazi plans and realizing that they had nothing left to lose, Zuckerman and resistance leaders such as Abba Kovner and Mordecai Anielewicz found the determination to resist and to risk their lives.
In March 1942 Zuckerman represented Hehalutz at a meeting of Zionist groups and urged the creation and arming of a defense organization. Others feared that resistance would provoke the Nazis to greater violence. But on July 28, soon after the first daily trainload of 5,000 Jews had left the Warsaw ghetto to be gassed at Treblinka, Jewish leaders accepted his view and created the Jewish Fighting Organization (Żydowska Organizacja Bojowa; ŻOB) under the leadership of Anielewicz. Zuckerman became one of his three co-commanders and also helped lead a political affiliate founded at the same time, the Jewish National Committee (Żydowski Komitet Narodowy). With numerous contacts in the underground resistance groups on the “Aryan side”—i.e., outside the ghetto—Zuckerman negotiated the gifts and black-market purchases of the pistols, grenades, and few rifles that the ŻOB obtained. He smuggled these, along with messages, into the ghetto through the Warsaw sewers.
When the Warsaw Ghetto Uprising broke out, Zuckerman was outside the ghetto and did what he could to spread the word of the plight of the ghetto’s remaining Jews to the Polish underground and to Poles and Jews abroad. He also smuggled in to the ŻOB any additional guns and grenades that could be found. After 20 days of battle, Anielewicz and his companions died when the Nazis overcame their command bunker, and Zuckerman returned to the ghetto to take charge. Before the end of the 28-day battle, he led some 75 ŻOB fighters, including his future wife, Zivia Lubetkin, through the sewers and into underground havens on the Aryan side.
Zuckerman continued to lead a Jewish band of guerrillas in the Polish underground and to alert Jewish leaders elsewhere to the situation of Jews inside Nazi Europe. At war’s end he organized underground transportation for Jewish refugees from Europe to Palestine, where he and Zivia settled in 1947. They, with others, were founders of the kibbutz Lohamei Hagetaot (Hebrew: “The Ghetto Fighters”), north of Haifa, where a memorial museum, Ghetto Fighters’ House, was established. Zuckerman and his wife were prosecution witnesses in the 1961 trial of Adolf Eichmann. Zuckerman was the author of A Surplus of Memory (1993; originally published in Hebrew, 1990).
Zuckerman was recognized as a hero for his efforts, but his heroism gave him little comfort. He began drinking after the war, and he suffered mental anguish. He told one interviewer, “If you could lick my heart, it would poison you.”
|
7140ee1778c53940ff030a38d587927d | https://www.britannica.com/biography/Yohannes-IV | Yohannes IV | Yohannes IV
Yohannes IV, English John IV, original name Kassa, (born 1831—died March 10, 1889, Metema, Sudan), emperor of Ethiopia (1872–89). Like his predecessor, Tewodros II (reigned 1855–68), Yohannes IV was a strong, progressive ruler, but he spent most of his time repelling military threats from Egypt, Italy, and the Mahdists of the Sudan.
Superior weaponry allowed Yohannes, a dejazmatch (earl) of Tigray in northern Ethiopia, to fight his way to the Ethiopian throne on January 21, 1872, four years after Tewodros’s death. His main rival was Menilek II, king of Shewa, who did not recognize Yohannes as emperor until 1878/79, after a military defeat. Menilek’s eclipse, however, was only temporary. In 1882 a dynastic marriage was arranged between Menilek’s daughter and Yohannes’ son, and it was agreed that Menilek would be Yohannes’ successor as emperor. Yohannes also recognized Menilek’s control of the south, and their separate spheres of influence were carefully defined. Tensions between the two rose again by 1888, however, when Menilek, fearing that Yohannes’ son might try to follow his father to the throne, made an agreement with the Italians in exchange for arms.
Aside from the recurrent problem of the powerful king of Shewa, Yohannes’ domestic concerns were mainly to reduce the power of the other regional nobles (and thus create a unitary government) and to increase his hold on his subjects through enforced conversion to the Ethiopian Orthodox church. His attempt to use religion as a basis for unity aroused resistance, however, particularly from Muslims who were ordered to build churches, pay tithes, and eventually be baptized.
The expansionist khedive (Ottoman viceroy) Ismāʿīl Pasha of Egypt posed the first external threat to Yohannes’ empire. By the mid-1870s Egypt had encroached on Ethiopia to the east and south, but Ethiopian forces, in what verged on an anti-Muslim crusade, won decisive victories in the mountainous country of the north in 1875 and 1876. Italy, the next aggressor, in 1885 occupied the former Turkish and Egyptian Red Sea port of Mitsiwa (now Massawa, Eritrea) and then began to expand inland toward the province of Tigray, only to be soundly defeated by Yohannes in 1887. In the same year, the Islamic revivalist Mahdist forces, gaining ground in the Sudan, invaded Ethiopia and devastated the old capital, Gonder. In retaliation, and possibly in the hope of getting Sudanese gold and slaves and even of gaining access to the Nile River, Yohannes invaded the Sudan and was killed in the Battle of Metema (March 1889).
|
1523c8c853dfecde7476016c02bd790d | https://www.britannica.com/biography/Yokomitsu-Riichi | Yokomitsu Riichi | Yokomitsu Riichi
Yokomitsu Riichi, also called Yokomitsu Toshikazu, (born March 17, 1898, Higashiyama Hot Springs, Fukushima prefecture, Japan—died Dec. 30, 1947, Tokyo), Japanese writer who, with Kawabata Yasunari, was one of the mainstays of the New Sensationalist school (Shinkankaku-ha) of Japanese writers, influenced by the avant-garde trends in European literature of the 1920s.
Yokomitsu began writing while still at Waseda University, Tokyo, which he left without graduating. In 1923 he joined the playwright Kikuchi Kan’s journal Bungei shunjū. In 1924 he joined Kawabata in publishing the journal Bungei jidai (“Literary Age”). Yokomitsu’s story Atama narabi ni hara (“Heads and Bellies”), published there that year, was hailed as a new kind of writing. In opposition to the autobiographical legacy of naturalism and the social pleading of proletarian literature, Yokomitsu developed an aesthetic of sensual impressions presented in fresh, startling ways. Haru wa basha ni notte (1926; Spring Came on a Horse-Drawn Cart), dealing with his wife’s fatal illness, is a lyrical, sensitive story; Kikai (1930; Machine) shows his growing obsession with the idea of a mechanistic principle governing human behaviour. Concerned always with the theory of writing, he put forth his ideas in Junsui shōsetsu ron (1935; “On the Pure Novel”).
|
0681d440415a9b7c8fc21283db3a21b7 | https://www.britannica.com/biography/Yongle/Accession-to-the-throne | Accession to the throne | Accession to the throne
The accession brought terrible retribution to those who had most closely advised Jianwen. They and all their relatives were put to death. Before the purge ended, thousands had perished. The new emperor also revoked the institutional and policy changes of his nephew-predecessor and even ordered history rewritten so that the founding emperor’s era name was extended through 1402, as if the Jianwen emperor had never reigned at all. The one reform policy that remained in effect was that princely powers must be curtailed. Hence, the surviving frontier princes were successively transferred from their strategically located fiefs into central and south China and were deprived of all governmental authority. From the Yongle period on, imperial princes were no more than salaried idlers who socially and ceremonially adorned the cities to which they were assigned and in which they were effectively confined. No subsequent Ming emperor was seriously threatened by a princely uprising.
As the Yongle emperor, Zhu Di was domineering, jealous of his authority, and inclined toward self-aggrandizement. He staffed the central government with young men dependent on himself and relied to an unprecedented extent on eunuchs for service outside their traditionally prescribed palace spheres—as foreign envoys, as supervisors of special projects such as the requisitioning of construction supplies, and as regional overseers of military garrisons. In 1420 he established a special eunuch agency called the Eastern Depot (Dongchang) charged with ferreting out treasonable activities. Although it did not become notorious in his own reign, it came to be a hated and feared secret police in collaboration with the imperial bodyguard in later decades and centuries.
The Yongle emperor also relied heavily on a secretarial group of young scholar-officials assigned to palace duty from the traditional compiling and editing agency, the Hanlin Academy, and by the end of his reign they became a Grand Secretariat, a powerful buffer between the emperor and the administrative agencies of government. Although the emperor, like his father, was quick to anger and sometimes abused officials cruelly, he built a strong and effective administration, and during his reign China settled into the generally stable political and socioeconomic patterns that were to characterize the remainder of the dynasty.
Like his father, Yongle had little personal respect for the higher forms of Chinese culture. In the fashion of the Mongol khans, he summoned to China and highly honoured a Tibetan lama, and the strongest intellectual influence on him may have been that of a monk named Daoyan, a long-favoured personal adviser. Along more orthodox lines, his government sponsored the compilation and publication of Confucian and Neo-Confucian Classics, and it most notably sponsored the preparation in manuscript form of a monumental compendium of literature called Yongle dadian (“The Great Canon of the Yongle Era”) in more than 11,000 volumes, which preserved many works that would otherwise have been lost. But the emperor himself must have considered such activities a kind of busywork for litterateurs who enjoyed public esteem but not his personal trust. A military man of action, the Yongle emperor had little enough patience with unavoidable administrative business, much less with intellectual exercises.
|
b3ce4e7e4b05cdabb35c51964e6ae82a | https://www.britannica.com/biography/Yoshida-Shigeru | Yoshida Shigeru | Yoshida Shigeru
Yoshida Shigeru, (born Sept. 22, 1878, Tokyo—died Oct. 20, 1967, Ōiso, Japan), Japanese political leader who served several terms as prime minister of Japan during most of the critical transition period after World War II, when Allied troops occupied the country and Japan was attempting to build new democratic institutions.
After graduating in law from Tokyo Imperial University in 1906, Yoshida entered the Foreign Ministry. In 1928 he was appointed minister to Sweden, Norway, and Denmark and then vice foreign minister (1928–30). In 1936 the army vetoed his appointment as foreign minister, and he was instead made ambassador to Great Britain, serving until 1939. During World War II his attempts to force an early Japanese surrender led to his arrest in June 1945. He was not freed until the Allied occupation in September of that year, and he then served as foreign minister in the Cabinet of Shidehara Kijūrō, which was formed following the surrender. After the head of the Liberal Party, Hatoyama Ichirō, was prohibited by the Allies from participation in politics, Yoshida assumed the party reins and succeeded to the prime ministership on May 22, 1946.
Although the Socialist leader Katayama Tetsu was able to form a Cabinet in 1947 and 1948, and the leftist Ashida Hitoshi held office for a while in 1948, Yoshida served as prime minister for most of the period between 1946 and 1954, forming five separate cabinets. Having built a large personal following, he was able to rule almost autocratically, giving Japan stability in this critical recovery period. He guided his country back to economic prosperity, setting the course for postwar cooperation with the United States and western Europe. In 1951 he negotiated the peace treaty that ended World War II, as well as a security pact between Japan and the United States.
In 1954 Hatoyama Ichirō, who had been taken off the Allied political purge list in 1951, challenged Yoshida for leadership of the Liberal Party, forcing him out of office. When the two conservative parties merged into the Liberal-Democratic Party under Hatoyama’s leadership in 1955, Yoshida retired from politics.
|
28bdb79979b00158e3f7ba251c31da35 | https://www.britannica.com/biography/Yu-Dafu | Yu Dafu | Yu Dafu
Yu Dafu, Wade-Giles romanization Yü Ta-fu, original name Yu Wen, courtesy name (zi) Dafu, (born December 7, 1896, Fuyang, Zhejiang province, China—died September 17, 1945, Sumatra, Dutch East Indies [now in Indonesia]), popular short-story writer of the 1920s in China, one of the founding members of the Creation Society, which was devoted to the promotion of modern literature.
Yu received his higher education in Japan, where he met other young Chinese writers with whom he founded the Creation Society (Chuangzaoshe) in 1921. His first collection of short stories, Chenlun (1921; “Sinking”), was written in vernacular Chinese, as advocated by the new generation of writers. Chenlun became a popular success in China because of its frank treatment of sex; when Yu returned to his country in 1922, he was a literary celebrity.
Yu continued his work with the Creation Society and edited or contributed to literary journals. He also continued to write short stories, but in 1923, after contracting tuberculosis, he abruptly changed his major theme from one of self-preoccupation to one of concern with the state of the masses. In 1927, following a disagreement with the communist members of the Creation Society, Yu attempted to reorganize the group but was forced to resign.
Yu’s first novel appeared in 1928 and was only moderately successful; his second followed four years later. In 1935 his last and major work of fiction, Chuben (“Flight”), was published. During the Sino-Japanese War (1937–45), Yu wrote anti-Japanese propaganda from Wuhan and Singapore. When Singapore fell to the Japanese in 1942, he fled to Sumatra, only to be executed by Japanese military police there shortly after the end of the war.
Of Yu’s many works the most popular was Riji jiuzhong (1927; “Nine Diaries”), an account of his affair with the young left-wing writer Wang Yingxia; the book broke all previous sales records in China. The critics’ favourite is probably Guoqu (1927; “The Past”), praised for its psychological depth. Yu also wrote essays and classical poetry.
|
31ba636835ca67e739b93faab96ba1e9 | https://www.britannica.com/biography/Yu-Qian | Yu Qian | Yu Qian
Yu Qian, Wade-Giles romanization Yü Ch’ien, (born 1398, Qiantang [now Hangzhou], Zhejiang province, China—died February 1457, Beijing), defense minister who saved China when the Yingzong emperor (reigning as Zhengtong, 1435–49) of the Ming dynasty was captured in 1449 while leading Chinese troops against the Mongol leader Esen Taiji.
With the emperor held hostage and the Mongol armies only 50 miles (80 km) northwest of the capital of Beijing, the government was in a state of panic. Yu Qian acted by placing the Yingzong emperor’s brother, the Jingtai emperor (reigned 1449–57), on the throne and preparing a cannon defense of the city. Soon after Esen attacked, he found his hostage valueless because a new emperor was on the throne, and he saw that the city was well-fortified. Hence, he abandoned the siege within days and retreated into Mongolia. Yu Qian made no efforts to ransom the abducted emperor, but Esen returned the captive in 1450. The Jingtai emperor, however, continued to rule until he fell ill in 1457. The former captive emperor took advantage of his brother’s failing health, returned to the throne (as the Tianshun emperor; reigned 1457–64) with the aid of a group of palace eunuchs, and had Yu Qian executed as a traitor.
|
f94544d76e43cc8e9d7626bb9df9b7e1 | https://www.britannica.com/biography/Yue-Fei | Yue Fei | Yue Fei
Yue Fei, Wade-Giles romanization Yüeh Fei, (born 1103, Tangyin, Henan province, China—died January 27, 1142, Lin’an [now Hangzhou], Zhejiang province), one of China’s greatest generals and national heroes.
In 1126 North China was overrun by the nomadic Juchen (Jin), and the Song capital at Kaifeng was taken. The former emperor Huizong, who had abdicated in 1125, together with his son, the Qinzong emperor (reigned 1125/26–27), was carried into captivity. Another son of Huizong, later known as the Gaozong emperor (reigned 1127–62), reestablished the dynasty in the south, hence its designation as the Nan (Southern) Song (1127–1279).
Retreating southward with Gaozong, Yue Fei assumed command of the Song forces. He prevented the advance of the Juchen by taking advantage of their difficulty in using their cavalry in hilly South China. Assuming the offensive, he was able to recover and secure some of the occupied territory in central China south of the Yangtze and Huai rivers.
However, his attempt to push north and recover all the lost Chinese territory was opposed by a peace party within the capital headed by the minister Qin Hui, who believed that further prosecution of the war would be too costly. Qin Hui’s faction proved more influential, Yue Fei was imprisoned in 1141 and executed early the next year, and a peace treaty was signed that relinquished the northern territory. Yue Fei became revered as a great national hero, and Qin Hui came to be viewed as a traitor. Since the beginning of the 20th century, Yue has been extolled as a champion of national resistance in the face of foreign domination.
|
8b009132a6202e49a8edb28d3d9fd5db | https://www.britannica.com/biography/Yukawa-Hideki | Yukawa Hideki | Yukawa Hideki
Yukawa Hideki, (born January 23, 1907, Tokyo, Japan—died September 8, 1981, Kyōto), Japanese physicist and recipient of the 1949 Nobel Prize for Physics for research on the theory of elementary particles.
Yukawa graduated from Kyōto Imperial University (now Kyōto University) in 1929 and became a lecturer there; in 1933 he moved to Ōsaka Imperial University (now Ōsaka University), where he earned his doctorate in 1938. He rejoined Kyōto Imperial University as a professor of theoretical physics (1939–50), held faculty appointments at the Institute for Advanced Study in Princeton, New Jersey (U.S.), and at Columbia University in New York City, and became director of the Research Institute for Fundamental Physics in Kyōto (1953–70).
In 1935, while a lecturer at Ōsaka Imperial University, Yukawa proposed a new theory of the strong and weak nuclear forces in which he predicted a new type of particle as those forces’ carrier. He called it the U-quantum; it later became known as the meson because its predicted mass lay between those of the electron and proton. American physicist Carl Anderson’s discovery in 1937 of a cosmic-ray particle with roughly the predicted mass suddenly established Yukawa’s fame as the founder of meson theory, which later became an important part of nuclear and high-energy physics. By the mid-1940s, however, it had become clear that Anderson’s particle, the muon, could not be the predicted carrier. The particle Yukawa had predicted, the pion, was finally discovered in 1947 by British physicist Cecil Powell. Pion exchange describes the force between nucleons only at longer ranges, however, and meson theory was ultimately superseded as a fundamental account of the strong interaction by quantum chromodynamics.
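The logic of that mass prediction can be sketched with the standard range–mass relation of a Yukawa-type potential (a textbook reconstruction added here for illustration, not a formula from the article; the range value of about 1.4 fm is an assumed round number):

% a force of finite range R is carried by a quantum of mass m = \hbar/(Rc)
V(r) = -\frac{g^{2}}{4\pi}\,\frac{e^{-r/R}}{r}, \qquad R = \frac{\hbar}{mc}
% taking R \approx 1.4\ \text{fm} and \hbar c \approx 197\ \text{MeV fm}:
mc^{2} \approx \frac{\hbar c}{R} \approx \frac{197\ \text{MeV fm}}{1.4\ \text{fm}} \approx 140\ \text{MeV}

That is roughly 270 electron masses (the electron’s rest energy is about 0.5 MeV) and well below the proton’s 938 MeV, which is why the hypothetical quantum came to be called a meson, or “middle” particle.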
After devoting himself to the development of meson theory, Yukawa started work in 1947 on a more comprehensive theory of elementary particles based on his idea of the so-called nonlocal field.
|
afbeb3b0fd5fd1341062c37abe103875 | https://www.britannica.com/biography/Yukio-Mishima | Mishima Yukio | Mishima Yukio
Mishima Yukio, pseudonym of Hiraoka Kimitake, (born January 14, 1925, Tokyo, Japan—died November 25, 1970, Tokyo), prolific writer who is regarded by many critics as the most important Japanese novelist of the 20th century.
Mishima was the son of a high civil servant and attended the aristocratic Peers School in Tokyo. During World War II, having failed to qualify physically for military service, he worked in a Tokyo factory, and after the war he studied law at the University of Tokyo. In 1948–49 he worked in the banking division of the Japanese Ministry of Finance. His first novel, Kamen no kokuhaku (1949; Confessions of a Mask), is a partly autobiographical work that describes with exceptional stylistic brilliance a homosexual who must mask his sexual preferences from the society around him. The novel gained Mishima immediate acclaim, and he began to devote his full energies to writing.
He followed up his initial success with several novels whose main characters are tormented by various physical or psychological problems or who are obsessed with unattainable ideals that make everyday happiness impossible for them. Among these works are Ai no kawaki (1950; Thirst for Love), Kinjiki (1954; Forbidden Colours), and Shiosai (1954; The Sound of Waves). Kinkaku-ji (1956; The Temple of the Golden Pavilion) is the story of a troubled young acolyte at a Buddhist temple who burns down the famous building because he himself cannot attain to its beauty. Utage no ato (1960; After the Banquet) explores the twin themes of middle-aged love and corruption in Japanese politics. In addition to novels, short stories, and essays, Mishima also wrote plays in the form of the Japanese Nō drama, producing reworked and modernized versions of the traditional stories. His plays include Sado kōshaku fujin (1965; Madame de Sade) and Kindai nōgaku shu (1956; Five Modern Nōh Plays).
Mishima’s last work, Hōjō no umi (1965–70; The Sea of Fertility), is a four-volume epic that is regarded by many as his most lasting achievement. Its four separate novels—Haru no yuki (Spring Snow), Homba (Runaway Horses), Akatsuki no tera (The Temple of Dawn), and Tennin gosui (The Decay of the Angel)—are set in Japan and cover the period from about 1912 to the 1960s. Each of them depicts a different reincarnation of the same being: as a young aristocrat in 1912, as a political fanatic in the 1930s, as a Thai princess before and after World War II, and as an evil young orphan in the 1960s. These books effectively communicate Mishima’s own increasing obsession with blood, death, and suicide, his interest in self-destructive personalities, and his rejection of the sterility of modern life.
Mishima’s novels are typically Japanese in their sensuous and imaginative appreciation of natural detail, but their solid and competent plots, their probing psychological analysis, and a certain understated humour helped make them widely read in other countries.
The short story “Yukoku” (“Patriotism”) from the collection Death in Midsummer, and Other Stories (1966) revealed Mishima’s own political views and proved prophetic of his own end. The story describes, with obvious admiration, a young army officer who commits seppuku, or ritual disembowelment, to demonstrate his loyalty to the Japanese emperor. Mishima was deeply attracted to the austere patriotism and martial spirit of Japan’s past, which he contrasted unfavourably to the materialistic Westernized people and the prosperous society of Japan in the postwar era. Mishima himself was torn between these differing values. Although he maintained an essentially Western lifestyle in his private life and had a vast knowledge of Western culture, he raged against Japan’s imitation of the West. He diligently developed the age-old Japanese arts of karate and kendo and formed a controversial private army of about 80 students, the Tate no Kai (Shield Society), with the aim of preserving the Japanese martial spirit and helping to protect the emperor (the symbol of Japanese culture) in case of an uprising by the left or a communist attack.
On November 25, 1970, after having that day delivered the final installment of The Sea of Fertility to his publisher, Mishima and four Shield Society followers seized control of the commanding general’s office at a military headquarters near downtown Tokyo. He gave a 10-minute speech from a balcony to a thousand assembled servicemen in which he urged them to overthrow Japan’s post-World War II constitution, which forbids war and Japanese rearmament. The soldiers’ response was unsympathetic, and Mishima then committed seppuku in the traditional manner, disemboweling himself with his sword, followed by decapitation at the hands of a follower. This shocking event aroused much speculation as to Mishima’s motives as well as regret that his death had robbed the world of such a gifted writer.
|
db36f0cd1d971d0f1d59b5bbe64c390e | https://www.britannica.com/biography/Yukiya-Amano | Yukiya Amano | Yukiya Amano
Yukiya Amano, Japanese Amano Yukiya, (born May 9, 1947, Kanagawa, Japan—died July 18, 2019), Japanese expert in nuclear disarmament and nonproliferation who was director general (2009–19) of the International Atomic Energy Agency (IAEA).
Amano joined Japan’s Foreign Ministry after graduating from Tokyo University’s law faculty in 1972. In 1988 he was appointed director for research coordination and senior research fellow of the Japan Institute of International Affairs. He was subsequently appointed director (1990) of the publications and information centre of the Organisation for Economic Co-operation and Development in Tokyo and director (1993) of the nuclear energy division of the Japanese Foreign Ministry. As Amano’s expertise on international issues concerning nuclear weapons grew, he participated in arms-control talks that led to the 1995 extension of the Treaty on the Non-proliferation of Nuclear Weapons, signed by 174 nations and designed to prevent the spread of nuclear weapons beyond countries that already possessed them; and the 1996 Comprehensive Nuclear-Test-Ban Treaty, a worldwide effort to end all nuclear testing. He then held positions in Japan’s Foreign Ministry as director general (2002) for arms control and scientific affairs and director general (2004) of the disarmament, nonproliferation, and science department. In 2005–09 he served as Japan’s envoy to and a member of the board of governors of the IAEA, an autonomous intergovernmental organization charged with safeguarding against nuclear proliferation and encouraging global cooperation in nuclear applications, energy, science, and technology.
In July 2009 Amano was elected the director general of the IAEA. In its role as a promoter of nuclear peace and nonproliferation, the organization had recently gained prominence with concerns that Iran might be developing nuclear weapons. Successive IAEA reports in 2009, 2010, and 2011 heightened those concerns, and finding a diplomatic solution to the issue of Iran’s nuclear program became a significant part of Amano’s role as head of the IAEA. A new round of talks with Iran in 2012 offered the hope of an end to Iran’s refusal to allow nuclear inspectors to visit secret military facilities, but the talks ended without an agreement.
In addition to nonproliferation efforts, the IAEA under Amano worked toward the advancement of nuclear safety. Following the Fukushima Daiichi nuclear accident in March 2011, Amano convened an IAEA Ministerial Conference on Nuclear Safety. The conference led to the creation of the agency’s first-ever action plan for nuclear safety, adopted by the IAEA Board of Governors in September 2011.
|
27bef103d52cf2bcddd127d78cb7ecf5 | https://www.britannica.com/biography/Yuri-Nikolayevich-Grigorovich | Yuri Grigorovich | Yuri Grigorovich
Yuri Grigorovich, in full Yuri Nikolayevich Grigorovich, (born January 2, 1927, Leningrad [now St. Petersburg], Russia, Soviet Union), Russian dancer and choreographer who was artistic director of the Bolshoi Ballet from 1964 to 1995.
Grigorovich graduated from the Leningrad Choreographic School in 1946 and joined the Kirov (now Mariinsky) Ballet, specializing in demi-caractère roles. He is best known, however, as a choreographer. The Stone Flower (1957) was one of his earliest successes at the Kirov, and two years later he remounted it for the Bolshoi Ballet in Moscow. In 1962 Grigorovich became the Kirov’s ballet master; two years later he was appointed chief choreographer and artistic director of the Bolshoi. Grigorovich’s productions at the Bolshoi included The Sleeping Beauty (1965), The Nutcracker (1966), Spartacus (1968), Swan Lake (1969), Ivan the Terrible (1975), and Angara (1976).
Grigorovich was named People’s Artist of the Russian S.F.S.R. (1966), and he received the Lenin Prize (1970) and the State Prize (1977). He was also the editor in chief of the Encyclopedia of Ballet. In 1995 Grigorovich was forced to resign his post with the Bolshoi amidst charges that he had allowed the company to become artistically stagnant during the last decade of his long tenure. However, in 2008 he returned to the Bolshoi, serving as a choreographer.
|
90a712a7f2484ac37baeff778927c790 | https://www.britannica.com/biography/Yuriy-Sedykh | Yuriy Sedykh | Yuriy Sedykh
Yuriy Sedykh, Yuriy also spelled Yuri, (born June 11, 1955, Novocherkassk, Russia, U.S.S.R.), Russian athlete who is considered the greatest hammer thrower of modern times. He set six world records and won two Olympic gold medals.
Sedykh began competing in the hammer throw in 1968. In 1972 Anatoly Bondarchuk, who had won a hammer throw gold medal in that year’s Munich Olympics, became Sedykh’s coach. The next year, Sedykh won the European junior championship, and at the 1976 Olympics in Montreal he won his first gold medal with a throw of 77.52 metres (254 feet 4 inches), while Bondarchuk won a bronze medal. Sedykh’s great rivalry with Sergey Litvinov began in 1980; at the Moscow Olympics, Sedykh set a new world record of 81.80 metres (268 feet 4 inches) in his first throw of the final round, beating out Litvinov and Juri Tamm. It was the second consecutive Olympics in which Soviets won all the hammer throw medals.
Sedykh was a master of the three-turn technique, keeping his arms straight as he turned with great speed in the circle, and he wrote a thesis on powerbuilding in hammer throw training. He won European championships in 1978 and 1982. Litvinov set world records in 1982 and 1983, which were superseded by Sedykh’s three world records in 1986, including a throw of 86.74 metres (284 feet 6 inches). Also in 1986 Sedykh defeated Litvinov soundly to win a third European championship. Sedykh continued to compete until 1995, and he subsequently became a coach.
|
11bd59d7734003bda42fdda6dcf288e8 | https://www.britannica.com/biography/Yury-Alexandrovich-Zavadsky | Yury Alexandrovich Zavadsky | Yury Alexandrovich Zavadsky
Yury Alexandrovich Zavadsky, (born June 30, 1894, Moscow, Russia—died April 5, 1977, Moscow), Soviet actor, director, and teacher whose eclectic vision ranged from foreign classics to modern heroic drama.
Zavadsky made his acting debut while studying with Eugene Vakhtangov, at whose theatre he played Anthony in Maurice Maeterlinck’s The Miracle of St. Anthony (1915). He continued with Vakhtangov and was a principal in his final and most acclaimed production, Turandot (1922). Zavadsky made his directorial debut with Nikolay Gogol’s The Marriage (1924), and the conscious theatricality of his staging demonstrated his debt to his teacher. He worked with the Moscow Art Theatre (1924–31) and became head of the Central Theatre of the Red Army (1932). While at the Central Theatre, Zavadsky began to meld the avant-garde lessons of Vakhtangov with the precepts of Konstantin Stanislavsky; his productions of patriotic dramas, such as Aleksandr Korniychuk’s The Destruction of the Squadron, revealed a new emphasis on clarity of form and ensemble acting.
After directing the Gorky Theatre in Rostov from 1936 to 1940, Zavadsky returned to Moscow to begin teaching at the State Institute of Theatre Arts and to become chief director of the Mossovet Theatre. He joined the Communist Party in 1944 and was made a full professor at the State Institute in 1947. He continued a series of foreign classics at the Mossovet, including The Merry Wives of Windsor (1957), and he produced works on patriotic themes, such as A. Surov’s Dawn Over Moscow (1950). He revived plays by 19th-century Russian playwrights; his Masquerade by Mikhail Lermontov won him a Lenin Prize (1965). Hallmarks of all his later productions were an elaborate use of music and meticulous ensemble acting.
|
38d7213fa56b39af011862bb687b5ed2 | https://www.britannica.com/biography/Yury-Andropov | Yury Andropov | Yury Andropov
Yury Andropov, in full Yury Vladimirovich Andropov, (born June 15 [June 2, Old Style], 1914, Nagutskoye, Russia—died February 9, 1984, Moscow, Russia, U.S.S.R.), head of the Soviet Union’s KGB (State Security Committee) from 1967 to 1982 and his country’s leader as general secretary of the Communist Party’s Central Committee from November 1982 until his death 15 months later.
The son of a railway worker, Andropov was a telegraph operator, film projectionist, and boatman on the Volga River before attending a technical college and, later, Petrozavodsk University. He became an organizer for the Young Communist League (Komsomol) in the Yaroslavl region and joined the Communist Party in 1939. His superiors noticed his abilities, and he was made head of the Komsomol in the newly created Karelo-Finnish Soviet Socialist Republic (1940–44).
The turning point in Andropov’s career was his transfer to Moscow (1951), where he was assigned to the party’s Secretariat staff, considered a training ground for promising young officials. As ambassador to Hungary (July 1954–March 1957), he played a major role in coordinating the Soviet invasion of that country. Andropov then returned to Moscow, rising rapidly through the Communist hierarchy and, in 1967, becoming head of the KGB. Andropov’s policies as head of the KGB were repressive; his tenure was noted for its suppression of political dissidents.
Andropov was elected to the Politburo, and, as Soviet leader Leonid Brezhnev’s health declined, Andropov began to position himself for succession, resigning his KGB post in 1982. Andropov was chosen by the Communist Party Central Committee to succeed Brezhnev as general secretary on November 12, scarcely two days after Brezhnev’s death. He consolidated his power by becoming chairman of the Presidium of the Supreme Soviet (president) on June 16, 1983.
Ill health overtook him by August 1983, and thereafter he was never seen again in public. He accomplished little and was succeeded by a former rival, Konstantin Chernenko.
|
292ad76667a8a8a3537b8904b87817dc | https://www.britannica.com/biography/Yury-Luzhkov | Yury Luzhkov | Yury Luzhkov
Yury Luzhkov, in full Yury Mikhaylovich Luzhkov, (born September 21, 1936, Moscow, Russia, U.S.S.R.—died December 10, 2019, Munich, Germany), Russian politician who served as mayor of Moscow (1992–2010). As mayor, he transformed Moscow into the engine of post-Soviet state capitalism.
Luzhkov studied mechanical engineering at the Gubkin Academy of Oil and Gas in Moscow. After graduating in 1958, he was a junior scientist at the Research and Development Institute of Plastics. He subsequently worked at various positions of increasing stature in the chemical industry, and by 1986 he was head of the science and technology department of the Chemical Industry Ministry in Moscow. In 1987 he became first deputy chairman of the Moscow government. Three years later, he rose to the position of executive committee leader under Mayor Gavriil Popov, and he became deputy mayor when Popov was reelected in 1991. Popov’s resignation in June 1992 prompted Russian Pres. Boris Yeltsin to name Luzhkov the new mayor.
Popular and powerful, Luzhkov was the quintessential khozyain (“boss”), a strong-willed, at times bullying, leader who had harnessed his loyal team to the single goal of remaking the city of Moscow. Through careful manipulation of post-Soviet privatization, the city owned about 1,500 businesses outright and had a financial stake in some 300 more. Luzhkov took a personal interest in these enterprises, from regular visits to construction sites to approving the menu and logo of Russkoye Bistro, a fast-food chain created to compete with McDonald’s. Though he was cognizant of the influence of organized crime in some new businesses, his administration remained untainted by any major scandal. In 1994 Luzhkov persuaded Yeltsin to give him control over the city’s vast inventory of state holdings, and in 1996 Moscow took in $1 billion in privatization revenues.
Often appearing in public in an open collar and peaked leather cap, Luzhkov affected a populist stance in his public battles with the Kremlin. Although he had backed Yeltsin in times of crisis—the coup attempt of August 1991, the parliamentary revolt of October 1993, and the presidential elections of June and July 1996—Luzhkov was often critical of the president and his young reform-minded advisers, particularly First Deputy Prime Minister Anatoly Chubais. Luzhkov frequently squared off against Chubais over the handling of the privatization process in Moscow. Outlying provinces also harboured suspicions of the mayor and his city’s newfound wealth, but Luzhkov was praised by his constituents, nearly 90 percent of whom reelected him over a communist challenger in June 1996.
By the late 1990s, having overseen a wave of entrepreneurialism and a building boom that had pushed office rents higher than those of New York City, Luzhkov had transformed Moscow into the engine of post-Soviet state capitalism. In September 1997 he hosted a lavish birthday party for his native city. The three-day extravaganza, which cost at least $60 million, was intended not only to celebrate Moscow’s rich 850-year history but also to show the world that the Russian capital, already home to two-thirds of the country’s foreign investment, was eager to maintain its rapid pace of development.
In 1998 Luzhkov started the Fatherland political party to serve as a platform for the 2000 presidential election. When he was not endorsed as the party’s presidential nominee, he ran for reelection as mayor of Moscow; he was reelected in 1999 and again in 2003. From 2003 he served as a co-chairman of United Russia, a party formed by Fatherland and other groups.
As a strong proponent of Russian nationalism, Luzhkov directed a significant part of the city’s budget toward the support of Russian separatists in Moldova and the Russian military in Ukraine as well as the building of new housing in Russian enclaves in Georgia. Luzhkov also had particularly outspoken views on homosexuality: he banned the city’s first planned gay pride parade in 2006 and later forbade other gay rights events in Moscow.
Meanwhile, under Luzhkov’s tenure, Moscow continued on a path of unprecedented growth. A thermal power station and a waste-processing plant opened in the city, new hotels and office complexes were constructed, and many of the city’s historic buildings were renovated. In 2007 Luzhkov was appointed to a fifth mayoral term by Pres. Vladimir Putin, who in 2004 had initiated a bill that gave him the power to appoint regional leaders. However, Luzhkov reportedly angered Putin’s successor, Dmitry Medvedev, by publicly criticizing his performance as president. After Luzhkov refused to resign, Medvedev dismissed the long-standing mayor in September 2010.
|
743e7339ae9978627f809b30805e359e | https://www.britannica.com/biography/Yury-of-Moscow | Yury of Moscow | Yury of Moscow
Yury of Moscow, however, gained the support of Öz Beg (Uzbek), khan (1313–41) of the Golden Horde, and in 1317 replaced Michael as grand prince. Michael refused to accept his loss and defeated the military force sent by Öz Beg and Yury to dethrone him.…
|
7119fe63d68e4b3c366a327c182574df | https://www.britannica.com/biography/Yusuf-Asar-Yathar | Yūsuf Asʾar Yathʾar | Yūsuf Asʾar Yathʾar
About 523 ce Yūsuf Asʾar Yathʾar (nicknamed Dhū Nuwās by the Muslim tradition), a Ḥimyarite king of Jewish faith, persecuted and killed numerous miaphysite Christians in Najrān on the northern frontier of Yemen. He also killed Byzantine merchants elsewhere in his kingdom. Outraged by the massacre and pressed…
…2nd century ce), Dhu Nuwas, proclaimed himself a Jew and finally suffered defeat in approximately 525 as a consequence of Christian influence on the Abyssinian armies. Jewish missionaries, however, continued to compete with Christian missionaries and thus helped to lay the groundwork for the birth of an indigenous…
…a convert to Judaism) named Yūsuf Asʾar Yathʾar. It seems that the conflict escalated from what had been (in one account) a trade dispute. Yūsuf massacred the entire Ethiopian population of the port of Mocha and of Ẓafār and, about a year later, the Christians of Najrān. Aksum retaliated with…
…Ḥimyarite king, Dhū Nuwās (Yūsuf Ashʿar; c. 6th century ce), was a convert to Judaism who carried out a major massacre of the Christian population of Yemen. The survivors called for aid from the Byzantine emperor, who arranged to have an army from the Christian kingdom of Aksum (in…
|
57b80c3322f741b6e73685c944b6396b | https://www.britannica.com/biography/Yves-Allegret | Yves Allégret | Yves Allégret
Yves Allégret, (born Oct. 13, 1907, Paris, France—died Jan. 31, 1987, Paris), French motion-picture director who gained fame for his work in the “film noir” genre that was popular in the late 1940s.
Allégret began his film career working as an assistant to his older brother, the director Marc Allégret, and for Augusto Genina and Jean Renoir. Entering films during the 1930s and working with directors involved in the avant-garde in France during that period, Allégret was influenced by the impressionist and surrealist ideas that these directors expressed in their films.
Although Allégret created several early short films and commercials, he did not direct his first feature film until 1941. His best films, many of which starred Simone Signoret, included Les Deux Timides (1942; “The Two Timid Ones”), Dédée d’Anvers (1947; Dedee), Une si jolie petite plage (1948; Such a Pretty Little Beach, or Riptide), Manèges (1949; The Cheat), Les Orgueilleux (1953; The Proud and the Beautiful), Oasis (1954), Germinal (1963), Johnny Banco (1967), L’Invasion (1970), Orzowei (1975), and Mords pas—on t’aime (1976; Don’t Bite—We Love You).
|
218204d7186354c425d63f360dd359da | https://www.britannica.com/biography/Yves-Delage | Yves Delage | Yves Delage
Yves Delage, (born May 13, 1854, Avignon, Fr.—died Oct. 7, 1920, Sceaux), French zoologist known for his research and elucidation of invertebrate physiology and anatomy. He also discovered the equilibrium-stabilizing function of the semicircular canals in the inner ear (1886).
Delage became a member of the zoology staff at the Sorbonne in 1880 and at Caen, Fr., in 1881; he became director of the zoological station at Luc in 1884, titular professor of zoology at the Sorbonne in 1886, and director of the marine zoological station at Roscoff in 1901.
Delage studied circulation in crustaceans, made important discoveries in the embryology of sponges, and investigated the parasitic crustaceans Sacculina and Peltogaster and the flatworm Convoluta. He developed a method for culturing sea urchin eggs following artificial fertilization by chemical means. Turning late in his career to more general problems of biology, he considered how life in individual organisms and species is manifested through cytoplasm, and he examined mechanical problems of the cell. He also became a strong proponent in France of the neo-Lamarckian view of heredity and evolution. His writings include La Structure du protoplasma, les théories sur l’hérédité et les grands problèmes de la biologie générale (1895; “The Structure of Protoplasm, the Theories of Heredity and the Great Problems of General Biology”), Traité de zoologie concrète, 6 vol. (1896–1903; “Treatise of Pure Zoology”), and Les Théories de l’évolution (1909; “The Theories of Evolution”) with Marie Goldsmith.
|
2f914e95c069b5013da455262d6f0aed | https://www.britannica.com/biography/Yves-Klein | Yves Klein | Yves Klein
Yves Klein, (born April 28, 1928, Nice, France—died June 6, 1962, Paris), French artist associated with the Parisian Nouveau Réalisme movement championed by the French critic Pierre Restany. The only painter in the founding group, Klein was a highly influential artist whose radical techniques and conceptual gestures laid the groundwork for much of the art of the 1960s and ’70s. His media were pure pigments, gold leaf, fire, water, live nude models (his “living brushes”), actions, and events.
Although Klein had no formal training in art, both his parents were artists, so he early on understood the power of the imagination as made manifest through idea, form, and particularly colour. In his early 20s Klein began his study of Rosicrucianism, a set of esoteric spiritual teachings, which would play a key role in his evolving mystical beliefs. In 1955 Klein settled in Paris after a stay in London and travels to Ireland, Spain, and Japan. While in Japan, Klein studied judo, achieving the black belt (master) level. He taught classes in that system of unarmed combat for several years.
During just a few years in Paris, Klein developed an extraordinary range of avant-garde work. He rejected the linear and reconceived form as “a value of impregnation,” the filling of space with “the pictorial immaterial sensibility.” To demonstrate this philosophy, expressed in several manifestos, he made monochrome paintings of evenly dispersed pure pigment. He also displayed the sponges he used to make the paintings as richly coloured works in themselves. During this period he worked chiefly in monochromes of three colours—gold, which he equated with physical material transformed to the spiritual; red, which he called “monopink” and equated with flesh-and-blood materiality; and ultramarine, which represented space—but blue dominated, and in 1960 he patented International Klein Blue, called IKB. In 1958, as part of a live performance, Klein choreographed female models who applied his paint to their bodies and then pressed their painted bodies on canvas or paper spread on the wall and on the floor. These “living brush” paintings, which left a distinct figural impression, were followed by his Anthropométries series, which employed the models in a variety of motions and left the canvas with arrays of gestural impressions. On March 9, 1960, Klein conducted a 20-minute performance of his Monotone Symphony while his models “painted” new pieces of art.
Klein ventured into other types of conceptual art as well. For The Void (1957) he emptied the Galerie Iris Clert in Paris, repainted its white walls white, and presented the empty space as a work of art. For Leap into the Void (1960) he staged a photograph showing the artist leaping, arms spread, from a building. Capturing the artist suspended in space, the photograph appears to show him levitating by his own spiritual power. Klein died at age 34, but the variety of work he produced in his brief life and his many manifestos made him one of the groundbreaking conceptual artists of the 20th century.
|
00c89f46c0cf7d0df1be28a89b65db4c | https://www.britannica.com/biography/Yvonne-Rainer | Yvonne Rainer | Yvonne Rainer
Yvonne Rainer, (born November 24, 1934, San Francisco, California, U.S.), American avant-garde choreographer and filmmaker whose work in both disciplines often featured the medium’s most fundamental elements rather than meeting conventional expectations.
Rainer moved to New York City in 1957 to study theatre. She found herself more strongly drawn to modern dance than acting, however, and began studying at the Martha Graham School and later with Merce Cunningham. Rainer was one of the organizers of the Judson Dance Theater, a focal point for vanguard activity in the dance world throughout the 1960s, and she formed her own company for a brief time after the Judson performances ended. Rainer was noted for an approach to dance that treated the body more as the source of an infinite variety of movements than as the purveyor of emotion or drama. Many of the elements she employed in the early 1970s—such as repetition, patterning, tasks, and games—later became standard features of modern dance.
Her best-known dance, “Trio A” (1966), a section of a larger work called The Mind Is a Muscle (1966–68), consisted of a simultaneous performance by three dancers that included a difficult series of circular and spiral movements. It was widely adapted and interpreted by other choreographers. Rainer choreographed more than 40 concert works, including Terrain (1963).
Rainer sometimes included filmed sequences in her dances, and in the mid-1970s she began to turn her attention to film directing. Her early films do not follow narrative conventions, instead combining reality and fiction, sound and visuals, to address social and political issues. Rainer directed several experimental films about dance and performance, including Lives of Performers (1972), Film About a Woman Who… (1974), and Kristina Taking Pictures (1976). Her later films included The Man Who Envied Women (1985), Privilege (1990), and MURDER and murder (1996). The last-mentioned work, more conventional in its narrative structure, is a lesbian love story as well as a reflection on urban life and on breast cancer, and it features Rainer herself. Her film work received several awards, and in 1990 she was a recipient of a MacArthur Foundation award.
In 2000 Rainer resumed her career as a choreographer, and her subsequent dances included Spiraling Down (2008), Assisted Living: Do You Have Any Money? (2013), and The Concept of Dust, or How do you look when there’s nothing left to move? (2014).
|
44f7daa1f4144abb65f28f316a437612 | https://www.britannica.com/biography/Z-S-Bongela | Z. S. Bongela | Z. S. Bongela
In K.S. Bongela’s Alitshoni lingenandaba (1971; “The Sun Does Not Set Without News”), the reader is led to a revelation of the corruption that results when traditional ties are broken. Christianity and urban corruption are at the centre of Witness K. Tamsanqa’s Inzala kaMlungisi (1954; “The…
|
7cf984d60b339d05b7a877345d1f8396 | https://www.britannica.com/biography/Zacharias-Topelius | Zacharias Topelius | Zacharias Topelius
Zacharias Topelius, (born Jan. 14, 1818, Kuddnäs, Russian Finland—died March 12, 1898, Helsinki), the father of the Finnish historical novel. His works, written in Swedish, are classics of Finland’s national literature.
Topelius joined the faculty of the University of Helsinki as professor of Finnish history in 1864; he served as university president, 1875–78. Though he published five collections of lyrics, he is best known for Fältskärns berättelser (1853–67; The King’s Ring and the Surgeon’s Stories, 1872), a romanticized account of Swedish–Finnish history during the 17th and 18th centuries. In later years he wrote stories based on Finnish folktales and fairy tales for children. All his works have been translated into Finnish.
|
5478f76e9b6861e10a1bd255c10bd545 | https://www.britannica.com/biography/Zaha-Hadid | Zaha Hadid | Zaha Hadid
Zaha Hadid, in full Dame Zaha Hadid, (born October 31, 1950, Baghdad, Iraq—died March 31, 2016, Miami, Florida, U.S.), Iraqi-born British architect known for her radical deconstructivist designs. In 2004 she became the first woman to be awarded the Pritzker Architecture Prize.
Zaha Hadid was born to an upper-middle-class family. Her father, Mohammed, was a politician, and her mother, Wajiha Sabunji, practiced art. Zaha had two elder brothers, Haytham and Foulath, the latter a noted academic. Zaha never married or had children, but she had several nephews and nieces, including Rana, an architect.
After attending a Catholic school and later an English boarding school, Hadid began her studies at the American University in Beirut, Lebanon, receiving a bachelor’s degree in mathematics. In 1972 she traveled to London to study at the Architectural Association, a major centre of progressive architectural thought during the 1970s. There she met the architects Elia Zenghelis and Rem Koolhaas, with whom she would collaborate as a partner at the Office of Metropolitan Architecture. Hadid established her own London-based firm, Zaha Hadid Architects (ZHA), in 1979.
In 1983 Hadid gained international recognition with her competition-winning entry for The Peak, a leisure and recreational centre in Hong Kong. This design, a “horizontal skyscraper” that moved at a dynamic diagonal down the hillside site, established her aesthetic: inspired by Kazimir Malevich and the Suprematists, her aggressive geometric designs are characterized by a sense of fragmentation, instability, and movement. This fragmented style led her to be grouped with architects known as “deconstructivists,” a classification made popular by the 1988 landmark exhibition “Deconstructivist Architecture” held at the Museum of Modern Art in New York City.
Hadid’s design for The Peak was never realized, nor were most of her other radical designs in the 1980s and early ’90s, including the Kurfürstendamm (1986) in Berlin, the Düsseldorf Art and Media Centre (1992–93), and the Cardiff Bay Opera House (1994) in Wales. Hadid began to be known as a “paper architect,” meaning her designs were too avant-garde to move beyond the sketch phase and actually be built. This impression of her was heightened when her beautifully rendered designs—often in the form of exquisitely detailed coloured paintings—were exhibited as works of art in major museums.
Hadid’s first major built project was the Vitra Fire Station (1989–93) in Weil am Rhein, Germany. Composed of a series of sharply angled planes, the structure resembles a bird in flight. Her other built works from this period included a housing project for IBA Housing (1989–93) in Berlin, the Mind Zone exhibition space (1999) at the Millennium Dome in Greenwich, London, and the Land Formation One exhibition space (1997–99) in Weil am Rhein. In all these projects, Hadid further explored her interest in creating interconnecting spaces and a dynamic sculptural form of architecture.
Hadid solidified her reputation as an architect of built works in 2000, when work began on her design for a new Lois & Richard Rosenthal Center for Contemporary Art in Cincinnati, Ohio. The 85,000-square-foot (7,900-square-metre) centre, which opened in 2003, was the first American museum designed by a woman. Essentially a vertical series of cubes and voids, the museum is located in the middle of Cincinnati’s downtown area. The side that faces the street has a translucent glass facade that invites passersby to look in on the workings of the museum and thereby contradicts the notion of the museum as an uninviting or remote space. The building’s plan gently curves upward after the visitor enters the building; Hadid said she hoped this would create an “urban carpet” that welcomes people into the museum.
In 2010 Hadid’s boldly imaginative design for the MAXXI museum of contemporary art and architecture in Rome earned her the Royal Institute of British Architects (RIBA) Stirling Prize for the best building by a British architect completed in the past year. She won a second Stirling Prize the following year for a sleek structure she conceived for Evelyn Grace Academy, a secondary school in London. Hadid’s fluid undulating design for the Heydar Aliyev Center, a cultural centre that opened in 2012 in Baku, Azerbaijan, won the London Design Museum’s Design of the Year in 2014. She was the first woman to earn that award—which judges designs in architecture, furniture, fashion, graphics, product, and transportation—and the design was the first from the architecture category. Her other notable works included the London Aquatics Centre built for the 2012 Olympics; the Eli and Edythe Broad Art Museum, which opened in 2012 at Michigan State University in East Lansing, Michigan; and the Jockey Club Innovation Tower (2014) for the Hong Kong Polytechnic University.
Hadid’s extraordinary accomplishments were all the more remarkable considering she was working in an industry largely dominated by men. Her supporters contended that she was often subjected to controversies that her male counterparts were not. Her fantastic forms were often derided, and the expense and scale of many of her commissions were frequently ridiculed. Indeed, the problematic site for the London Aquatics Centre forced Hadid to scale back her design, while mounting protests, notably from preeminent Japanese architects, led her to scrap her plan altogether for the New National Stadium for the 2020 Olympics in Tokyo (the Olympics were later postponed because of the coronavirus pandemic). Further controversy followed after a 2014 report disclosed that some 1,000 foreign workers had died because of poor working conditions across construction sites in Qatar, where her Al Wakrah Stadium for the 2022 World Cup was set to break ground. When asked about the deaths, Hadid denied that she, as an architect, bore responsibility for ensuring safe working conditions, and her remarks were widely regarded as insensitive. An architecture critic of The New York Review of Books exacerbated the situation when he falsely claimed that 1,000 workers had died building her stadium, which had yet to break ground. Hadid filed a defamation lawsuit against the critic and the publication. She later settled, accepting an apology and donating the undisclosed sum to a charity protecting labour rights.
Hadid taught architecture at many places, including the Architectural Association, Harvard University, the University of Chicago, and Yale University. She also designed furniture, jewelry, footwear, bags, interior spaces such as restaurants, and stage sets, notably for the 2014 Los Angeles Philharmonic production of Wolfgang Amadeus Mozart’s Così fan tutte.
At her sudden death from a heart attack while being treated for bronchitis in 2016, Hadid left 36 unfinished projects, including the 2022 World Cup stadium, the Antwerp Port House (2016), and the King Abdullah Petroleum Studies and Research Center (2017; KAPSARC) in Riyadh, Saudi Arabia. Her business partner, Patrik Schumacher, assumed leadership of her firm, assuring the completion of existing commissions and the procurement of new ones.
In addition to the Pritzker Prize and the Stirling Prize, her numerous awards included the Japan Art Association’s Praemium Imperiale prize for architecture (2009) and the Royal Gold Medal for Architecture (2016), RIBA’s highest honour. Hadid was a member of the Encyclopædia Britannica Editorial Board of Advisors (2005–06). In 2012 she was made a Dame Commander of the Order of the British Empire (DBE).
|
9b461d06fbe5d6f7a310d069bd645e71 | https://www.britannica.com/biography/Zahi-Hawass | Zahi Hawass | Zahi Hawass
Zahi Hawass, (born May 28, 1947, Al-ʿUbaydiyyah, Egypt), Egyptian archaeologist and public official, whose magnetic personality and forceful advocacy helped raise awareness of the excavation and preservation efforts he oversaw as head of Egypt’s Supreme Council of Antiquities (SCA). He served as Egypt’s minister of antiquities in 2011.
Hawass grew up near Damietta, Egypt, and entered Alexandria University with the intention of becoming a lawyer. He eventually changed his course of study to Greek and Roman archaeology, but it was not until after graduation (B.A., 1967) that he developed a passion for the subject, while working as an inspector for the Department of Antiquities (the forerunner of the SCA). Following a one-year postgraduate course in Egyptology at Cairo University, Hawass won a Fulbright fellowship and enrolled in a Ph.D. program in Egyptology at the University of Pennsylvania, from which he graduated in 1987. He then returned to Egypt, where he was named general director of antiquities for the Giza pyramids complex as well as for the historical sites at Ṣaqqārah and Al-Wāḥāt al-Baḥriyyah (Bahariya Oasis).
At Giza in 1990, Hawass discovered a necropolis that housed the tombs of the pyramid builders, which proved, contrary to then-popular fringe theories, that the pyramids were indeed erected by Egyptians. Hawass’s frequent outspoken denunciations of the alternative theorists, whom he termed “pyramidiots,” established his international reputation. His profile was further raised in the late 1990s when he began the excavation of an extensive collection of tombs at Bahariya Oasis. The site became known as the Valley of the Golden Mummies after the tombs’ well-preserved denizens, the most that had ever been found at a single site.
By the time he was appointed secretary-general of the SCA in 2002, Hawass had appeared on numerous American television programs promoting Egypt’s archaeological heritage. His ubiquitous media presence made him one of the most recognizable figures in Egypt but also one of the most divisive. Critics noted his tendency toward glib self-aggrandizement, which minimized the accomplishments of other antiquities workers, and charged that he too often privileged public relations over science. (Few of his scientific findings were published in peer-reviewed journals.) At the same time, Hawass was lauded for reclaiming Egyptology—for decades the province of Western scholars—for Egyptians. His zealous promotional efforts were seen to have engendered national pride and to have helped attract tourism.
As head of the SCA, Hawass directed several other excavation projects that led to significant findings, including the discovery in 2008 of an Old Kingdom pyramid at Ṣaqqārah that was determined to belong to Sesheshet, the mother of King Teti. He also initiated the Egyptian Mummy Project, which used modern forensic techniques such as CAT scans to study both royal and nonroyal mummies. As a result of that project, in 2007 Hawass announced that he had identified the remains of Hatshepsut, and in 2010 it was determined that Tutankhamen was the son of Akhenaton and probably died of complications from malaria and bone disease. In 2009, facing mandatory retirement, Hawass was appointed Egypt’s vice minister of culture, with responsibility for the SCA. Among his ongoing endeavours was the repatriation of several notable Egyptian artifacts from foreign museums, including the Rosetta Stone and the bust of Nefertiti.
In January 2011, after Egyptian Pres. Hosni Mubarak shuffled his cabinet following antigovernment protests, Hawass was appointed to the newly created position of minister of antiquities. However, after protests forced Mubarak to step down as president on February 11, Hawass remained in his position for only a few weeks; he resigned in March—to protest, he said, the insufficient security for museums and archaeological sites following the onset of protests, which had resulted in looting. Less than a month after his resignation, he was reappointed by Egypt’s interim prime minister, Essam Sharaf. However, Hawass faced increasingly vocal criticism from Egyptian archaeologists who denounced his domineering management style and questioned his financial dealings. Opposition to Hawass culminated in a series of demonstrations calling for his removal, led by Egyptian archaeologists outside the headquarters of the ministry of antiquities. In July 2011 Hawass was one of more than a dozen government ministers removed from their posts in a cabinet reorganization meant to defuse widespread popular protests against the Egyptian interim government.
|
36b569ef4c70f21455680a577dca0dea | https://www.britannica.com/biography/Zak-Starkey | Zak Starkey | Zak Starkey
…Daltrey were supported by drummer Zak Starkey (son of Ringo Starr) and Townshend’s brother Simon on guitar, among others. A full-blown musical based on this material and also titled The Boy Who Heard Music premiered in July 2007 at Vassar College in Poughkeepsie, New York. The Who later performed at…
|
f796017e76aa2b963024a8cfb16b5efc | https://www.britannica.com/biography/Zane-Grey | Zane Grey | Zane Grey
Zane Grey, original name Pearl Grey, (born Jan. 31, 1872, Zanesville, Ohio, U.S.—died Oct. 23, 1939, Altadena, Calif.), prolific writer whose romantic novels of the American West largely created a new literary genre, the western.
Trained as a dentist, Grey practiced in New York City from 1898 to 1904, when he published privately a novel of pioneer life, Betty Zane, based on an ancestor’s journal. Deciding to abandon dentistry for full-time writing, he published in 1905 The Spirit of the Border—also based on Zane’s notes—which became a best-seller. Grey subsequently wrote more than 80 books, a number of which were published posthumously; more than 50 were in print in the last quarter of the 20th century. The novel Riders of the Purple Sage (1912) was the most popular; others included The Lone Star Ranger (1915), The U.P. Trail (1918), Call of the Canyon (1924), and Code of the West (1934). Prominent among his nonfiction works is Tales of Fishing (1925).
|
6d946b6c1c21bf4a83df0198a5134920 | https://www.britannica.com/biography/Zangi-Iraqi-ruler | Zangī | Zangī
Zangī, in full ʿImād al-Dīn Zangī ibn Aq Sonqur, Zangī also spelled Zengi, (born 1084—died 1146, Mosul, Iraq), Iraqi ruler who founded the Zangid dynasty and led the first important counterattacks against the Crusader kingdoms in the Middle East.
When Zangī’s father, the governor of Aleppo, was killed in 1094, Zangī fled to Mosul. He served the Seljuq dynasty, and in 1126 the Seljuq sultan, Maḥmūd II, appointed Zangī governor of Basra. When the ʿAbbasid caliph al-Mustarshid rebelled in 1127, Zangī supported the sultan, and the victorious Maḥmūd II rewarded Zangī by giving him the governorship of Mosul. Next, the key city of Aleppo submitted to Zangī’s authority to secure military protection against a possible Frankish Crusader conquest.
Zangī thus came to exercise authority over a considerable geographic area, but he wanted to create a kingdom that would also include Syria and Palestine. He was charged by the sultan with the duty of defeating the Christian Crusaders, and he saw himself as the champion of Islam. He was opposed, however, by Muslim princes who refused to accept his authority as well as by the Crusaders. To both Zangī reacted with equal harshness. By diplomacy, treachery, and warfare he steadily extended his authority, with the immediate goal of securing control of Damascus—a goal he never achieved. He did, however, capture Edessa, an important focal point of Frankish authority, in 1144—the Crusaders’ first serious setback. Zangī could not press his advantage. Returning to Iraq to repress a revolt there, he was killed by a servant who bore him a personal grudge.
|
81b632f085345819628f5ae0bb51bc5a | https://www.britannica.com/biography/Zasu-Pitts | Zasu Pitts | Zasu Pitts
Trina (played by Zasu Pitts) is a simple woman who wins a $5,000 lottery and then finds herself caught in a love triangle characterized by greed and jealousy with her husband, McTeague (Gibson Gowland), and her former lover, Marcus (Jean Hersholt). The plot is an old standard: money…
|
ed8c973cdc32e47ce2bab92fdf7cd5eb | https://www.britannica.com/biography/Zayd-ibn-Ali | Zayd ibn ʿAlī | Zayd ibn ʿAlī
…these risings was led by Zayd ibn ʿAlī, a half-brother of ʿAlī’s great grandson Muḥammad al-Bāqir by ʿAlī’s son Ḥusayn. In 740, encouraged by Kufan elements, Zayd rose against the Umayyads, on the principle that the imam could lay claim to leadership only if he openly declared himself imam. Zayd…
…imam: while Zaydis believed that Zayd ibn ʿAlī should become the fifth imam because he had attained the highest degree of learning, many believed that Muḥammad al-Bāqir possessed superior ʿilm by pedigree. Zaydi Shiʿism, which survives today as the third largest sect of Shiʿi Islam, continues to view the imamate…
…Shīʿite Muslims owing allegiance to Zayd ibn ʿAlī, grandson of Ḥusayn ibn ʿAlī. Zayd was a son of the fourth Shīʿite imam, ʿAlī ibn Ḥusayn, and a brother of Muḥammad al-Bāqir. At a time when the designation and role of the Shīʿite imam was being defined, the followers of Zayd…
|
b7a455ac9e61323e5178df9738bc7e90 | https://www.britannica.com/biography/Zbigniew-Brzezinski | Zbigniew Brzezinski | Zbigniew Brzezinski
Zbigniew Brzezinski, in full Zbigniew Kazimierz Brzezinski, (born March 28, 1928, Warsaw, Poland—died May 26, 2017, Falls Church, Virginia), U.S. international relations scholar and national security adviser in the administration of Pres. Jimmy Carter who played key roles in negotiating the SALT II nuclear weapons treaty between the United States and the Soviet Union and in U.S. efforts to sustain the rule of Mohammad Reza Shah Pahlavi, the shah of Iran.
Brzezinski’s father was a prominent member of the Polish government who was appointed ambassador to Canada in 1938. When Soviet-backed communists took over the Polish government in 1945, the Brzezinski family was stranded in Canada. After this event Brzezinski harboured a deep opposition to communism and the Soviet Union.
Brzezinski studied economics and political science at McGill University in Montreal (B.A., 1948) and political science at McGill (M.A., 1950) and at Harvard University (Ph.D., 1953). He was later (1953–60) an instructor and assistant professor of government at Harvard and a research fellow and research associate of Harvard’s Russian Research Center (later the Davis Center for Russian and Eurasian Studies) and its Center for International Affairs (later the Weatherhead Center for International Affairs). He was an associate professor of public law and government at Columbia University from 1960 to 1962, when he became the first director of Columbia’s Research Institute on Communist Affairs (later the Research Institute on International Change), a position he held until 1977. During the 1960s he was also a foreign affairs adviser to Presidents John F. Kennedy and Lyndon B. Johnson. While serving as the first director (1973–76) of the Trilateral Commission, Brzezinski met Jimmy Carter, who was then the Democratic governor of Georgia, and acted as Carter’s foreign affairs adviser during his successful presidential campaign. Brzezinski served as national security adviser in the Carter administration (1977–81). Afterward he resumed teaching at Columbia (1981–89) and then served (from 1989) as senior research professor of international relations at the Paul H. Nitze School of Advanced International Studies at Johns Hopkins University.
The Carter foreign policy team achieved several major successes. In addition to playing a role in negotiating the SALT II treaty (which Carter withdrew from U.S. Senate consideration following the Soviet invasion of Afghanistan in 1979), Brzezinski helped Carter renegotiate the Panama Canal Treaty (ratified 1978) and prepare for the eventual transfer of authority over the canal to Panama. In addition, Brzezinski worked assiduously on improving U.S. relations with China. Under his guidance, the United States opened its first official embassy in the Chinese capital since the communists assumed power in 1949.
Brzezinski’s tenure as national security adviser was marked by his public disputes with the State Department. Friction between Brzezinski and Secretary of State Cyrus Vance began during the negotiations over the SALT II treaty. Both Carter and Brzezinski sought to radically expand the scope of the treaty by proposing that the Soviet Union drastically limit the number of its intercontinental ballistic missiles in exchange for limits on U.S. cruise missiles. However, Vance was not informed of this offer until he joined the negotiations. When the Soviets initially refused, Vance was publicly embarrassed.
In 1979 Brzezinski made his greatest mistake when he advocated steadfast U.S. support for the shah of Iran. Although American intelligence had questioned whether the shah could retain power during the Iranian Revolution (1978–79), Brzezinski persuaded Carter to reject the revolutionaries’ demands. Consequently, when the revolution succeeded, the United States had no contact with Iran’s new religious leaders—a situation that severely limited the diplomatic options available to the United States during the Iran hostage crisis (1979–81). The perception that Carter had mishandled the crisis strongly contributed to his defeat in the 1980 presidential election.
Brzezinski’s many books include Between Two Ages: America’s Role in the Technetronic Era (1970), in which he predicted that the United States and the Soviet Union would eventually confront each other in the developing world in a battle over natural resources; The Grand Failure: The Birth and Death of Communism in the Twentieth Century (1989); The Choice: Global Domination or Global Leadership (2004); and Strategic Vision: America and the Crisis of Global Power (2012).
|
bf1330447eccc0f2976854852ac34520 | https://www.britannica.com/biography/Zbigniew-Herbert | Zbigniew Herbert | Zbigniew Herbert
Zbigniew Herbert, (born October 29, 1924, Lwów, Poland [now Lviv, Ukraine]—died July 28, 1998, Warsaw), one of the leading Polish poets of the post-World War II generation.
Herbert attended an underground high school during the wartime German occupation of Poland and also took secret military training courses with the Polish Home Army. After World War II he earned degrees in economics, law, and philosophy at various universities in Poland. He published little poetry in 1949–54, when Socialist Realism was mandatory in Poland, but in 1955 he began a long association with the literary review Twórczość (“Creation”). Herbert’s first collection of poems, Struna światła (1956; “Chord of Light”), was followed by Hermes, pies i gwiazda (1957; “Hermes, a Dog and a Star”), Studium przedmiotu (1961; “A Study of the Object”), and such later volumes as Pan Cogito (1974; Mr. Cogito) and Raport z oblężonego miasta (1983; Report from the Besieged City and Other Poems). After travels in France and Italy between 1958 and 1961, Herbert published the essays inspired by these visits as Barbarzyńca w ogrodzie (1962; Barbarian in the Garden). From 1975 to 1992, he lived mostly in western Europe, although during that time he returned to Poland for the five years from 1981 to 1986. Then, from 1992 until his death, he made his home in Poland.
Herbert’s poetry expresses an ironic moralism in free verse laden with classical and other historical allusions. In reflecting on Poland’s traumatic experiences at the hands of the Nazis and Soviets during World War II and afterward, he uses a sarcastic rhetoric to question the gap between ideal morality and the nightmares of 20th-century totalitarianism. English translations of his poems appear in Elegy for the Departure and Other Poems and in Selected Poems (1968 and 1977). The King of the Ants: Mythological Essays (1999) comprises some of his essays.
Herbert’s poetry and his essays evoke the best traditions of antiquity, relating them to modern times in an inspiring way and showing the sources of European civilization reaching back to Greek and Roman mythology as relevant factors of modern philosophy, art, and literature.
|
b2a7ceb2cb5948580795834a1fa33546 | https://www.britannica.com/biography/Zdenek-Lev | Zdeněk Lev | Zdeněk Lev
…of loyal lords, he relieved Zdeněk Lev of Rožmitál of the office of supreme burgrave in February 1523 and appointed Prince Karel of Minstrberk, a grandson of George of Poděbrady, to that key position in provincial administration. Religious controversies that flared up soon after Martin Luther’s attack on indulgences (October…
|
a74826c27deee2166c5ed2860f09a32d | https://www.britannica.com/biography/Zebulon-B-Vance | Zebulon B. Vance | Zebulon B. Vance
Zebulon B. Vance, in full Zebulon Baird Vance, (born May 13, 1830, Buncombe county, N.C., U.S.—died April 14, 1894, Washington, D.C.), North Carolina representative, governor, and senator during the American Civil War and Reconstruction eras.
Vance studied law at the University of North Carolina and for a time practiced in Asheville. Elected in 1854 as a Whig member of the North Carolina House of Commons, Vance in 1858 won a seat in the U.S. House of Representatives, running on the Know-Nothing ticket. Upon the outbreak of the Civil War, however, he sided with the Confederacy and organized his own company of troops.
Vance was elected governor of North Carolina in 1862 and won reelection in 1864. In May 1865 he surrendered to federal military authorities and shortly thereafter was imprisoned in Washington, D.C. Vance was pardoned in 1867 and soon plunged back into politics. In 1870 he was elected to the U.S. Senate, but the Radical Republicans refused to let him take his seat.
In 1876 Vance was again elected governor of North Carolina, marking the end of the Reconstruction governments in that state. After two years of his four-year term, he was elected to the U.S. Senate and took his seat on March 18, 1879. Reelected twice (for terms beginning in 1885 and 1891), he opposed the protective tariff, the internal-revenue system, civil-service reform, and the repeal of the Sherman Silver Act. His name is not associated with any constructive legislation.
|
09348c531ea6b54e2a0198048b97e73c | https://www.britannica.com/biography/Zelda-Fitzgerald | Zelda Fitzgerald | Zelda Fitzgerald
Zelda Fitzgerald, née Zelda Sayre, (born July 24, 1900, Montgomery, Alabama, U.S.—died March 10, 1948, Asheville, North Carolina), American writer and artist, best known for personifying the carefree ideals of the 1920s flapper and for her tumultuous marriage to F. Scott Fitzgerald.
Zelda was the youngest daughter of Alabama Supreme Court Justice Anthony Dickinson Sayre and Minnie Buckner Machen Sayre. She was a high-spirited and wayward child, and as a teen, her lack of propriety—notably flirting, drinking, and smoking—raised the eyebrows of the genteel set in her hometown.
Following her high school graduation in 1918, Zelda met F. Scott Fitzgerald at a weekend country club dance. She was a regular at such social activities, and he was an officer stationed at nearby Camp Sheridan. Scott began a courtship, but Zelda was hesitant about his financial prospects and continued to court other suitors. When he published his first novel, This Side of Paradise, in March 1920, she finally agreed to marry him, and the two wed in New York on April 3. Zelda gave birth to their only child, Frances (“Scottie”) Fitzgerald, the following year.
This Side of Paradise was an immediate success, and the couple became overnight celebrities. In rendering the youthful rebellion of the 1920s, Scott became known as the chronicler of the Jazz Age, and Zelda became an emblem of the 1920s liberated woman. They both indulged in an extravagant lifestyle, spending beyond their means on travel, parties, and liquor. In 1924 the Fitzgeralds moved to France, where they joined a group of American expatriates, led by Gerald and Sara Murphy, on the Riviera. There Scott finished his third novel, The Great Gatsby, in 1925. Although the book would later become a classic, its middling initial reception disappointed Scott. By the end of the decade, the Fitzgeralds’ already quarrelsome marriage had grown more agitated. Scott struggled to write his fourth novel, and Zelda sought creative outlets of her own, writing short stories for magazines, painting, swimming, and intensely practicing ballet, a hobby from her youth.
In 1930 Zelda had a mental breakdown and spent the next year in different European clinics. When she was released in 1931, the Fitzgeralds moved back to the United States. Zelda, however, had another breakdown in 1932 and entered Phipps Psychiatric Clinic in Baltimore, where she wrote her only novel, Save Me the Waltz (1932). The book was largely autobiographical, relating her side of the Fitzgeralds’ troubled marriage through the characters of Alabama Beggs and her painter husband, David Knight. Scott resented Zelda’s use of the same material he planned to use for his novel, and he blamed her medical bills for keeping him from finishing his own work. Save Me the Waltz, however, did not sell well, and Zelda turned to playwriting. Scandalabra, described as a “fantasy-farce,” was staged by a small theatre group in Baltimore in 1933, but its rambling banter only confused critics. Her next creative endeavour, painting, did not fare better, with a New York show in 1934 bringing ambivalent reviews.
Meanwhile, Scott finally published Tender Is the Night (1934), nearly 10 years after finishing his third novel. By this time, however, the Fitzgeralds were greatly in debt, Scott was struggling with alcoholism, and Zelda was in and out of health clinics. In 1936 Zelda entered Highland Hospital in Asheville, North Carolina, and in 1937 Scott moved to Hollywood to become a scriptwriter. He died of a heart attack there three years later at the age of 44. Zelda continued to paint and started a second novel, Caesar’s Things, but perished in a fire at Highland Hospital in 1948 before she could finish it. She never attained the creative success she eagerly sought, but she and Scott inspired numerous biographies, novels, movies, and TV series.
|
229fd1596de02806c979dcaa28636e89 | https://www.britannica.com/biography/Zeno-emperor | Zeno | Zeno
Zeno, (born, Isauria, Diocese of the East—died April 9, 491), Eastern Roman emperor whose reign (474–91) was troubled by revolts and religious dissension.
Until he married the Eastern emperor Leo I’s daughter Ariadne (in 466 or 467), Zeno had been known as Tarasicodissa. As such he led an Isaurian army that the emperor relied upon to offset the influence of German troops under the powerful patrician Aspar. In 469 Zeno was appointed consul and master of the soldiers. On the death of Leo I early in 474, Zeno’s seven-year-old son reigned as Leo II; the child died before the end of the year, after having appointed his father coemperor.
Zeno made a lasting peace with the Vandals in Africa but soon encountered difficulties at home when his most trusted adviser, the Isaurian Illus, plotted a coup d’etat with Leo I’s brother-in-law Basiliscus. The emperor, with many of his followers, was forced to flee to Isauria. Basiliscus reigned at Constantinople for 20 months, but his religious beliefs made him highly unpopular.
With the help of Illus, who changed his allegiance, Zeno returned to Constantinople in August 476. Illus, who had gained great influence in the government, raised a rebellion in Asia Minor (484) and, though severely defeated, held out against the emperor until captured and beheaded in 488. During those years Zeno also had to deal with revolts of the Ostrogoths under Theodoric. By appointing Theodoric to replace Odoacer as king of Italy (489), Zeno was able to persuade the Ostrogoths to leave the Eastern Empire.
Although the rest of Zeno’s reign was free from revolts and invasions, there were bitter disputes between the Christians who accepted the Council of Chalcedon (451) affirming Christ had distinct divine and human natures and the miaphysites, an opposing faction that believed the divine and human natures were one in Christ. The emperor sought to reconcile the two groups with his letter, the Henotikon, addressed to the church in Egypt (482). The doctrines expressed in this document were acceptable to the miaphysites and brought a measure of religious peace to the East, but they resulted in a schism with the church at Rome that lasted from 484 to 519.
|
82fcaab190d30762655e22923bc73283 | https://www.britannica.com/biography/Zenobia | Zenobia | Zenobia
Zenobia, in full Septimia Zenobia, Aramaic Znwbyā Bat Zabbai, (died after 274), queen of the Roman colony of Palmyra, in present-day Syria, from 267 or 268 to 272. She conquered several of Rome’s eastern provinces before she was subjugated by the emperor Aurelian (ruled 270–275).
Zenobia’s husband, Odaenathus, Rome’s client ruler of Palmyra, had by 267 recovered the Roman East from Persian conquerors. After Odaenathus and his eldest son (by his former wife), Herodes (or Herodianus), were assassinated in 267 or 268, Zenobia became regent for her own young son Wahballat (called Vaballathus in Latin, Athenodorus in Greek). Styling herself queen of Palmyra, she had Vaballathus adopt his father’s titles of “king of kings” and corrector totius Orientis (“governor of all the East”).
Nevertheless, unlike Odaenathus, Zenobia was not content to remain a Roman client. In 269 she seized Egypt, then conquered much of Asia Minor and declared her independence from Rome. Marching east, Aurelian defeated her armies at Antioch (now Antakya, Turkey) and at Emesa (now Ḥimṣ, Syria) and besieged Palmyra. Zenobia and Vaballathus tried to flee from the city, but they were captured before they could cross the Euphrates River, and the Palmyrenes soon surrendered. When they revolted again in 273, the Romans recaptured and destroyed the city. Sources differ about Zenobia’s fate after her capture. According to some, Zenobia and Vaballathus graced the triumphal procession that Aurelian celebrated at Rome in 274. However, other historians claim that she starved herself to death during the trip to Rome.
|
5fcc97ec730657b611ee1273a95a2d3f | https://www.britannica.com/biography/Zenodotus-of-Ephesus | Zenodotus Of Ephesus | Zenodotus Of Ephesus
Zenodotus Of Ephesus, (flourished 3rd century bc), Greek grammarian and first superintendent (from c. 284 bc) of the library at Alexandria, noted for editions of Greek poets and especially for producing the first critical edition of Homer.
Zenodotus lived during the reigns of the first two Ptolemies and was a pupil of Philetas of Cos. While serving as superintendent of the library at Alexandria, he directed the work of editing the Greek epic and perhaps the lyric poets. After comparing different manuscripts of Homer, he deleted doubtful lines, transposed others, made emendations, and divided the Iliad and the Odyssey into 24 books each.
Zenodotus’ edition—knowledge of which is derived almost entirely from later scholia on Homer—was severely attacked for its subjectivity by later scholars, notably one of his successors at the library, Aristarchus of Samothrace (c. 217–c. 145 bc), who modified Zenodotus’ work.
Zenodotus also compiled a Homeric glossary, edited the Theogony of Hesiod, and published studies of Pindar and Anacreon, traces of which survive in a papyrus from Oxyrhynchus. He is also said to have written epic poetry.
|
f6aff179a3bffdac6c5e6c3b9b0dc5e8 | https://www.britannica.com/biography/Zeppo-Marx | Zeppo Marx | Zeppo Marx
…1977, Palm Springs, California), and Zeppo (original name Herbert Marx; b. February 25, 1901, New York City—d. November 30, 1979, Palm Springs).
|
36a8a3709c112b1539e7d555a8e99e80 | https://www.britannica.com/biography/Zhang-Daqian | Zhang Daqian | Zhang Daqian
Zhang Daqian, Wade-Giles romanization Chang Ta-ch’ien, (born May 10, 1899, Neijiang, Sichuan province, China—died April 2, 1983, Taipei, Taiwan), painter and collector who was one of the most internationally renowned Chinese artists of the 20th century.
As a child, Zhang was encouraged by his family to pursue painting. In 1917 his elder brother, Zhang Shanzi (an artist famous for his tiger paintings), accompanied him to Kyoto, Japan, to study textile dyeing. Two years later, Zhang Daqian went to Shanghai to receive traditional painting instruction from two famous calligraphers and painters of the time, Zeng Xi and Li Ruiqing. Through his association with these teachers, Zhang had the opportunity to study some works by ancient masters in detail. His early style attempted to emulate the Ming-Qing Individualists, including Tang Yin, Chen Hongshou, and Shitao. He meticulously studied and copied their works and began to make forgeries; he gained notoriety when one of his forged Shitaos successfully deceived the connoisseurs.
After his early success in Shanghai, Zhang extended his career to the north in the late 1920s, when he became active in the cultural circles of Beijing. He began to collaborate with the well-known Beijing painter Pu Xinyu, and together they became known as the “South Zhang and North Pu,” an epithet that is still used to refer to their collaborative works of the 1930s.
In 1940 Zhang led a group of artists to the caves of Mogao and Yulin for the purpose of copying their Buddhist wall paintings. The group completed over 200 paintings, and the experience left Zhang with a repository of religious imagery. During the Sino-Japanese War, the artist zealously studied traditional Tang-Song figure painting and ancient monumental landscape painting. He would use elements of these in his own work, becoming particularly known for his lotus paintings, inspired by ancient works. His love of tradition was also reflected in his personal collection of ancient Chinese paintings, which he began early in his career. At its peak, his collection contained several hundred works from the Tang to Qing dynasties.
In reaction to the political climate in 1949, Zhang left China in the early 1950s. He resided in various places, including Mendoza, Argentina; São Paulo, Brazil; and Carmel, California. His meeting with Pablo Picasso in 1956 in Nice, France, was publicized as an artistic meeting between East and West.
Zhang developed eye problems in the late 1950s. As his eyesight deteriorated, he developed his mature splashed colour (pocai) style. Although he attributed this style in part to the splashed-ink technique of the ancient painter Wang Mo, many believe it to be related to the style of the Abstract Expressionist movement then popular in the United States and a departure from that of his traditional paintings. Zhang’s splashed-colour paintings fetched the highest market prices for contemporary Chinese paintings at international auctions of the time.
In 1978 the artist settled in Taipei, Taiwan. His residence, Moye-jingshe, next to the National Palace Museum, is now the Memorial Museum of Zhang Daqian.
|
87b75fa17ce3369fa748b9182b094d94 | https://www.britannica.com/biography/Zhang-Xueliang | Zhang Xueliang | Zhang Xueliang
Zhang Xueliang, Wade-Giles romanization Chang Hsüeh-liang, courtesy name Hanqing, byname Shaoshuai (“Young Marshal”), (born June 3, 1901, Haicheng, Liaoning province, China—died October 14, 2001, Honolulu, Hawaii, U.S.), Chinese warlord who, together with Yang Hucheng, in the Xi’an Incident (1936), compelled the Nationalist leader Chiang Kai-shek (Jiang Jieshi) to form a wartime alliance with the Chinese communists against Japan.
Zhang Xueliang was the oldest son of the warlord Zhang Zuolin, who dominated Manchuria (now Northeast China) and parts of North China. The younger Zhang was prepared for a military career and joined his father’s army at age 20. Rising swiftly through the ranks, he was promoted to the command of one of his father’s armies in 1922. Upon Zhang Zuolin’s murder by Japanese officers in 1928, Zhang Xueliang assumed control of Manchuria and, ignoring both the warnings and the growing power of the Japanese in Manchuria, aligned himself with the newly formed Nationalist government at Nanjing. The Japanese then drove his forces from Manchuria and occupied the region; Zhang withdrew his troops into Shaanxi province in northwestern China.
It was in Shaanxi in 1935–36 that Chiang Kai-shek used Zhang’s troops in his military campaigns against the Chinese communists based in nearby Yan’an. However, the increasingly patriotic Zhang became convinced that his military units and those of the Nationalists should be fighting the Japanese invaders, not their fellow Chinese. When Chiang Kai-shek came to Zhang Xueliang’s headquarters at Xi’an in Shaanxi in 1936 to take personal charge of the Nationalist war against the Chinese communists, Zhang arrested the Nationalist leader. He released him only when Chiang Kai-shek agreed to form a United Front with the Chinese communists against the Japanese. Unwisely returning to Nanjing with Chiang Kai-shek, Zhang was soon placed under house arrest. When Chiang’s government fled to Taiwan in 1949, Zhang was taken there and continued to be kept under house arrest. Although the government reportedly lifted house arrest in the early 1960s, Zhang remained at his home near Taipei until 1991, when he traveled to the United States. In 1994 he settled in Hawaii.
|
86357d94387b4a98a11de95292b25210 | https://www.britannica.com/biography/Zhang-Zai | Zhang Zai | Zhang Zai
Zhang Zai, Wade-Giles romanization Chang Tsai, (born 1020, Changan, China—died 1077, China), realist philosopher of the Song dynasty, a leader in giving neo-Confucianism a metaphysical and epistemological foundation.
The son of a magistrate, Zhang studied Buddhism and Daoism but found his true inspiration in the Confucian Classics. In his chief work, Zhengmeng (“Correcting Youthful Ignorance”), he declared that the world is a unity, with myriad aspects, and all existence is a process of arising and dissolving. Qi (“vital breath”) is identified with the Great Ultimate (taiji), the ultimate reality. When qi is influenced by yang forces, it floats and rises, dispersing its vapours. When the yin forces are prevalent, qi sinks and falls, thus condensing and forming the concrete things of the material world.
In the realm of ethics, the one basic virtue is ren (“humaneness”), but in its various manifestations (i.e., in various human relations) ren becomes many things: filial piety toward parents or respect for an elder brother. Human beings are qi, like all other aspects of the world, and have an original nature that is one with all the things of the world. Their physical nature, however, derives from the physical form into which their qi has been dispersed. Moral self-cultivation consists in a person’s attempting to do his duty as a member of society and as a member of the cosmos. One does not try to prolong or extend one’s life. The exemplary person understands that “life entails no gain nor death any loss.”
Zhang influenced some of the most eminent later neo-Confucian thinkers; the brothers Cheng Hao (1032–85) and Cheng Yi (1033–1107) were his pupils. His theory of mind was adopted by the great philosopher Zhu Xi (1130–1200), and Wang Fuzhi (1619–92) developed Zhang’s philosophy into a system that has recently come to be recognized as one of the major achievements of Chinese thought.
|
22a2582768ddbaed3ad6cc03f69d0d02 | https://www.britannica.com/biography/Zhang-Zhongjing | Zhang Zhongjing | Zhang Zhongjing
Zhang Zhongjing, Wade-Giles romanization Chang Chung-ching, (born c. 150 ce—died c. 219), Chinese physician who wrote in the early 3rd century ce a work titled Shang han za bing lun (Treatise on Febrile and Other Diseases), which greatly influenced the practice of traditional Chinese medicine. The original work was later edited and divided into two books, Shang han lun (Treatise on Febrile Diseases) and Jin gui yao lue (Jingui Collection of Prescriptions). Today, Zhang’s work remains highly regarded and important in the practice of Chinese medicine, and he is often referred to as the Chinese Hippocrates.
Zhang’s Treatise was an important book on dietetics and was especially influential for its information on typhoid and other fevers. Zhang’s work was revered in the East for as long a time as Greek physician Galen of Pergamum’s works were popular in the West. Zhang described typhoid clearly and recommended the use of only a few potent drugs in treating it. The drugs were to be used one at a time, a considerable advance from the shotgun prescriptions then common. Zhang stated that cool baths were also an important part of the treatment, an idea that remained unused for 1,700 years until Scottish physician James Currie promoted it in his famous treatise on fever therapy.
Zhang paid close attention to the physical signs, symptoms, kind, and course of a disease, and he carefully recorded the results obtained from any drugs that he prescribed. He forthrightly stood for the dignity and responsibility of the medical profession, and this attitude, coupled with his close powers of observation, makes it easy to understand why he has become known by the name of his Greek medical ancestor Hippocrates. In the 16th and 17th centuries there was a strong revival of his teachings and practices.
|
6e1f572bc8872b2d5f966e66e1d79487 | https://www.britannica.com/biography/Zhao-Gao | Zhao Gao | Zhao Gao
Zhao Gao, Wade-Giles romanization Chao Kao, (born, Zhao state, China—died 207 bce, China), Chinese eunuch who conspired to seize power on the death of Shihuangdi, first emperor of the Qin dynasty (221–207 bce). His action eventually led to the downfall of the dynasty.
As the chief eunuch to Shihuangdi, Zhao Gao handled all the emperor’s communications with the outside world, so that he had no difficulty in concealing Shihuangdi’s death while on a trip in 210 bce. The emperor’s eldest son was in exile on the northern frontier because he had opposed the measures of the minister Li Si to burn all books as a means of proscribing heterodox thought. The emperor’s last orders were contained in a sealed letter to his eldest son, whom he named heir apparent. Fearing that the crown prince, if he succeeded to the throne, would have them dismissed and probably killed, Li and Zhao forged a letter to the prince and his companion Meng Tian, the commander of the army of the north, ordering them to commit suicide. The forgery was not immediately discovered, and the two men died. Li and Zhao hastened to return to the capital with the dead emperor, concealing the malodorous corpse in a wagon load of salt fish attached to the rear of the imperial carriage. The conspirators then forged a decree that called for the emperor’s youngest son, Hu Hai, to ascend the throne.
Li and Zhao soon turned on one another, and Zhao had Li executed. Rebellions thereupon erupted throughout the country, and the rebels soon marched on the capital. Zhao executed the puppet sovereign and set another man on the throne, whom he also attempted to execute. His plot discovered, Zhao was assassinated as he entered the palace.
|
66b9c1b7a67ec307a2ce1eeb4e803cb0 | https://www.britannica.com/biography/Zheliu-Zhelev | Zheliu Zhelev | Zheliu Zhelev
Zheliu Zhelev, in full Zheliu Mitev Zhelev, (born March 3, 1935, Veselinovo, Bulgaria—died January 30, 2015, Sofia), Bulgarian dissident and politician who served as president of Bulgaria from 1990 to 1997.
Zhelev graduated with a degree in philosophy from St. Clement of Ohrid University of Sofia (1958). In 1965 he was expelled from the university and the Bulgarian Communist Party (BCP) after he refused to alter his dissertation to remove its criticism of Leninism. He was subsequently (1966) banished from Sofia, spending the next six years unemployed. He finally obtained a Ph.D. in philosophy with a new dissertation in 1974, when he was appointed a researcher for the Bulgarian State Institute of Culture, an office of the Bulgarian Foreign Ministry that engaged in cultural diplomacy. He later (1977–82) served as head of the institute’s culture and personality department. He was awarded a Doctor of Science degree in philosophy in 1988.
From the early 1980s Zhelev was a prominent figure within Bulgaria’s small dissident movement. His scholarly book Fascism (written in 1967) was removed from bookstores and banned only three weeks after its publication in 1982 when authorities realized that its critique of fascist regimes applied equally to the communist governments of eastern Europe (the book’s original title was The Totalitarian State). Clandestinely circulated, it reached a wide audience in the country.
After working as an environmental activist in the industrial city of Ruse, Zhelev became a founding member and chair of the country’s first dissident organization, the Club for Support of Glasnost and Perestroika (1989). Following the forced resignation of the country’s longtime leader, Todor Zhivkov, in November 1989, Zhelev was elected the first chair of the Union of Democratic Forces (UDF), a loose association of dissident groups and revived prewar political parties dedicated to bringing about democratic reform in Bulgaria. In June 1990 he won a seat in a Grand National Assembly to draft a new constitution, and in August the Assembly elected him head of state (president) of Bulgaria. In parliamentary and presidential elections held under the new constitution in October 1991 and January 1992, respectively, the UDF’s new leader, Philip Dimitrov, became prime minister and Zhelev became president for a five-year term. His influence as president was limited, however, by dissension within the UDF and by the election in 1994 of a government led by the Bulgarian Socialist Party, as the former BCP now called itself.
In 1996 he was defeated in UDF primary elections by Petar Stoyanov, who became Bulgaria’s next president in 1997. Zhelev later founded the Dr. Zheliu Zhelev Foundation (1997) and the Balkan Political Club (2001), an organization of intellectuals and current and former political leaders in southeastern Europe.
|
3fa40fe170199f4bcf53b9ae8e2450b5 | https://www.britannica.com/biography/Zheng-Zhilong | Zheng Zhilong | Zheng Zhilong
Zheng Zhilong, Wade-Giles romanization Cheng Chih-lung, original name Zheng Yiguan, also called Iquan, (born 1604, Nan’an, Fujian province, China—died Nov. 24, 1661, Beijing), Chinese pirate leader who achieved great power in the transitional period between the Ming (1368–1644) and Qing (1644–1911/12) dynasties.
As a boy, Zheng found employment with the Europeans in the Portuguese settlement at Macau, where he was baptized and given the Christian name of Nicholas Gaspard. After leaving Macau, he joined a pirate band that preyed on Dutch and Chinese trade. In 1628 he was induced by the government to help defend the coast against both the Dutch and the pirates. He soon acquired great wealth and power.
When the capital of the Ming dynasty at Beijing was captured in 1644 by the Manchu of Manchuria (who founded the Qing dynasty), Zheng set up the Prince of Tang, or Zhu Yujian, in Fujian province in South China as the claimant to the Ming throne. Two years later, when the Manchu army achieved a sweeping victory in central China, Zheng again changed sides and was given titles and high office by the Qing government. But Zheng’s son, Zheng Chenggong (also known as Koxinga), the famous pirate leader who controlled the island of Formosa (Taiwan), refused to surrender to Qing forces. As a result, Zheng was imprisoned and stripped of all rank in 1655. He was executed in 1661 for his son’s stubborn refusal to surrender.
|
e0fb11b1a04d5694a6569da9d3a18b9f | https://www.britannica.com/biography/Zhou-Zuoren | Zhou Zuoren | Zhou Zuoren
Zhou Zuoren, Wade-Giles romanization Chou Tso-jen, original name Zhou Kuishou, (born January 16, 1885, Shaoxing, Zhejiang province, China—died May 6, 1967, Beijing), Chinese essayist, critic, and literary scholar who translated fiction and myths from many languages into vernacular Chinese. He was the most important Chinese essayist of the 1920s and 1930s.
Zhou Zuoren, who was the younger brother of the renowned writer Zhou Shuren (literary name [hao] Lu Xun), received a classical education. In 1906 the two brothers went to Japan, where Zhou Zuoren studied Japanese language and literature, Classical Greek literature, and English literature. He translated and published, together with Lu Xun, a collection of European fiction, selecting works to stimulate the people of China with the examples of others who had rebelled under oppressive rule.
Zhou and his Japanese wife returned to China in 1911. He became a professor at Peking University in 1917 and began writing the essays that won him renown. Among his favourite topics were the need for language reform and the use of the vernacular; he also advocated what he termed a “humane” literature and praised the realism of Western writers. His collections of translations—from Greek, Roman, Russian, and Japanese literature—continued to be published as his popularity as an authority in foreign literature increased.
Because he remained in Beijing during the Sino-Japanese War (1937–45) and worked for a Japanese-sponsored bureau of education, Zhou was tried as a collaborator by the National Government after the war ended and was condemned to death. His sentence was commuted to imprisonment, and he received a full pardon in 1949, which permitted him to continue his research. After the communist takeover that same year, he returned to Beijing, where he continued to write and translate.
|
bb3fe7419215fd458525268b92b5287d | https://www.britannica.com/biography/Zhu-Da | Zhu Da | Zhu Da
Zhu Da, Wade-Giles romanization Chu Ta, or literary name (hao) Bada Shanren, (born c. 1625, Nanchang, Jiangxi province, China—died c. 1705), Buddhist monk who was, with Shitao, one of the most famous Individualist painters of the early Qing period.
Details of Zhu’s life are unclear, but he is known to have been a descendant of the Ming imperial line, to have had a classical education, and to have become a Buddhist monk in 1648, after the collapse of the Ming dynasty. Possibly the fall of that dynasty and the death of his father at about the same time caused him some psychic disturbance, and he may have hovered between real insanity and impassioned creativity. He eventually left the Buddhist cloister and exhibited wildly erratic behaviour—such as writing the character for “dumb” (ya) and attaching it to his door and then refusing to speak.
In his paintings, usually in ink monochrome, such creatures as birds and fishes are given a curious, glowering, sometimes even perverse personality. He used an abbreviated, wet style that, while deceptively simple, captures the very essence of the flowers, plants, and creatures he portrays. He also painted landscapes in a dashing shorthand inspired by the 10th-century masters Dong Yuan and Juran. Unlike most Chinese painters, he does not easily fit into any traditional category; in character and personality he was the complete eccentric and “individualist.”
|
dce31112bffaacb8e1e19752c6ecc11c | https://www.britannica.com/biography/Zidantas-II | Zidantas II | Zidantas II
…by which a Hittite king—presumably Zidantas II or Huzziyas—paid tribute to the pharaoh in return for certain frontier adjustments, but it is not clear to what extent Syria was dominated by Thutmose III between 1471 and his death. During this period the national unity of the Hurrians seems to have…
|
be9adbd53195daac14faa5c7173ba1f5 | https://www.britannica.com/biography/Ziya-al-Din-Barani | Ẕiyāʾ al-Dīn Baranī | Ẕiyāʾ al-Dīn Baranī
Ẕiyāʾ al-Dīn Baranī, Baranī also spelled Barni, (born 1285, India—died after 1357), the first known Muslim to write a history of India. He resided for 17 years at Delhi as nadim (boon companion) of Sultan Muḥammad ibn Tughluq.
Using mainly hearsay evidence and his personal experiences at court, Baranī in 1357 wrote the Tārīkh-e Fīrūz Shāhī (“History of Fīrūz Shāh”), a didactic work setting down the duties of the Indian sultan toward Islam. In his Fatawā-ye jahāndārī (“Rulings on Temporal Government”), influenced by Sufī mysticism, he expounded a religious philosophy of history that viewed the events in the lives of great men as manifestations of divine providence. According to Baranī, the Delhi sultans from Ghiyās̄ al-Dīn Balban (reigned 1266–87) to Fīrūz Shāh Tughluq (reigned from 1351) who had followed his guidelines for the good Islamic ruler had prospered, while those who had deviated from those precepts had failed.
|
8fe7da2529fd267ec47136bc008b12a6 | https://www.britannica.com/biography/Ziyadat-Allah-I | Ziyādat Allāh I | Ziyādat Allāh I
…[3 km] south of Kairouan); Ziyādat Allāh I (817–838), who broke the rebellion of the Arab soldiery and sent it to conquer Sicily (which remained in Arab hands for two centuries); and Abū Ibrāhim Aḥmad (856–863), who commissioned many public works. During the 9th century the brilliant Kairouan civilization evolved…
|
4fc2e588d66ef8ee209d706e7a652303 | https://www.britannica.com/biography/Zola-Taylor | Zola Taylor | Zola Taylor
…1992, New York, New York), Zola Taylor (b. March 17, 1934/38, Los Angeles, California, U.S.—d. April 30, 2007, Riverside, California), David Lynch (b. July 3, 1929, St. Louis, Missouri, U.S.—d. January 2, 1981, Long Beach, California), Paul Robi (b. August 30, 1931, New Orleans, Louisiana, U.S.—d. February 1, 1989, Los…
|
56e6733b2433cc2fc661b406cfa33433 | https://www.britannica.com/biography/Zoltan-Huszarik | Zoltán Huszárik | Zoltán Huszárik
Zoltán Huszárik, Hungarian form Huszárik Zoltán, (born May 14, 1931, Domony, Hung.—died Oct. 15, 1981, Budapest), Hungarian filmmaker who directed numerous poetic short films and two feature films, the best-known of which is Szindbád (1971; “Sinbad”).
Huszárik studied directing at the School of Film and Dramatic Arts in Budapest from 1949 to 1952. He was expelled, however, probably because his widowed mother was a kulak, a member of the wealthy peasant class that the communist government treated as enemies of the state. Huszárik supported himself with temporary jobs but lived virtually at the poverty level, painting in his spare time. In 1957 he returned to the film industry as a set inspector. In 1959 he was allowed to continue his studies at the School of Film and Dramatic Arts, and he earned a diploma in 1961. He first worked as a production assistant. Then, in 1965, at the experimental Béla Balázs Studio, he made his first short film, Elégia (1965; “Elegy”). The film follows free-roaming horses on the Hungarian plain as they are turned into beasts of burden and eventually slaughtered.
Szindbád, based on Gyula Krúdy’s short novels from the turn of the 20th century, was released in 1971. The film is unusual in that it has virtually no plot and focuses instead on the personality of its protagonist, played by one of Hungarian cinema’s best-known actors, Zoltán Latinovits, who delivered a particularly memorable performance. The film was well received by audiences and critics alike, a success that enabled Huszárik to make the short films Capriccio, Amerigo Tot, Tisztelet az öregasszonyoknak, and A piacere (“As You Like It”). In 1979 he completed his second feature film, Csontváry, a tribute to painter Tivadar Csontváry-Kosztka. Its lack of success may have fueled the self-destructive lifestyle that led to Huszárik’s death soon afterward.
|
040fee693a2f77d098208b0312278957 | https://www.britannica.com/biography/Zoran-Zaev | Zoran Zaev | Zoran Zaev
…Tsipras and Macedonian Prime Minister Zoran Zaev announced that they had reached an agreement (which became known as the Prespa Agreement by virtue of its signing on the banks of Lake Prespa) under which Macedonia would be known both domestically and internationally as the Republic of North Macedonia. According to…
…2017 when the SDSM leader Zoran Zaev won the support of ethnic Albanian parties by promising to support legislation that would extend existing constitutional language rights to make Albanian the country’s second official language. (An amendment to the constitution in response to the Ohrid Framework Agreement had made Albanian an…
…June 2018 Macedonian Prime Minister Zoran Zaev and Greek Prime Minister Alexis Tsipras announced that an agreement (later known as the Prespa Agreement) had been reached under which Macedonia would be known both domestically and internationally as the Republic of North Macedonia or as North Macedonia for short (Macedonian: Severna…
Finally, in June 2018, Prime Minister Zaev and Greek Prime Minister Alexis Tsipras announced that an agreement (thereafter known as the Prespa Agreement) had been reached under which Macedonia would be known both domestically and internationally as the Republic of North Macedonia (Macedonian: Severna Makedonija). The name change required both…
…he and Macedonian Prime Minister Zoran Zaev announced that they had reached an understanding regarding the long-standing dispute over Macedonia’s name. Under the agreement (which became known as the Prespa Agreement by virtue of its signing on the banks of Lake Prespa), Macedonia would be known both domestically and internationally…
|
ffa7c1d957207b9c041f92deef8d0bcc | https://www.britannica.com/biography/Zou-Yan | Zou Yan | Zou Yan
Zou Yan, Wade-Giles romanization Tsou Yen, (born 340—died 260? bce), Chinese cosmologist of the ancient state of Qi (in present-day Shandong) and leading exponent of the Yinyang school. The only account of his life is a brief one in the Shiji (“Record of the Historian”). To him is attributed the association of the Five Phases (wuxing) theory with the doctrine of yinyang. Nature was thought to consist of changing combinations of the Five Phases (metal, wood, water, fire, earth), which were governed by the basic polarity of the cosmic principles of yin (earth, female, passive, absorbing) and yang (heaven, male, active, penetrating).
|
c6e9a68b5469ef72157917da9bf3f714 | https://www.britannica.com/biography/Zuhayr-ibn-Abi-Sulma | Zuhayr ibn Abī Sulmā | Zuhayr ibn Abī Sulmā
Zuhayr ibn Abī Sulmā, (born c. 520—died c. 609, Najd region, Arabia), one of the greatest of the Arab poets of pre-Islamic times, best known for his long ode in the Muʿallaqāt collection.
Zuhayr was from the Muzaynah tribe but lived among the Ghaṭafān. Zuhayr’s father was a poet, his first wife the sister of a poet, and two of his sons were poets. The elder son, Kaʿb, is famous for the poem he recited for the Prophet Muhammad, thereby signalling his acceptance of Islam. Zuhayr’s poem in Al-Muʿallaqāt praises the men who brought peace between the clans of ʿAbs and Dhubyān. In the poem, war is compared to a millstone that grinds those who set it moving, and the poet speaks as one who from a long life has learned humankind’s need for morality. Zuhayr’s extant poetry, available in several Arabic editions, includes other poems of praise and satires.
|
e70a34961bbcfdd6c77846bd165bf317 | https://www.britannica.com/biography/Zuhayr-ibn-Qays-al-Balawi | Zuhayr ibn Qays al-Balawī | Zuhayr ibn Qays al-Balawī
The first, commanded by Zuhayr ibn Qays al-Balawī, reoccupied Kairouan, then pursued Kusaylah westward to Mams, where he was defeated and killed. The dates of these operations are uncertain, but they must have occurred before 688 when Zuhayr ibn Qays himself was killed in an attack on Byzantine positions…
|
a6bc341a83bdfb4f3d5dc83fa216f00e | https://www.smithsonianmag.com/history/10-fun-facts-about-original-patriots-180962032/ | Ten Fun Facts About the Original Patriots | Ten Fun Facts About the Original Patriots
The New England Patriots may not have gained their name until 1960, or their mascot until shortly thereafter (thanks to Phil Bissell’s cartoon for the Boston Globe, which earned him the sobriquet “Pat’s Pa”), but the history behind their mascot stretches back hundreds of years. Whether you’re more history buff than sports fan or you just want to revisit the Revolutionary War, we’ve got 10 fun facts about patriots to get you ready for the big game.
Ben Franklin popularized the label “patriot”
The term “patriot” was first used regularly by Benjamin Franklin in the years leading up to the war, and came to refer to those colonial soldiers fighting against the British Army for their independence (Franklin himself was a patriot, and he also championed American foods like cranberries, maple syrup and Indian corn).
Though the romantic version of the Revolutionary War would have us believe that the Patriots—those fighting against Loyalists or Tories for independence from Britain—were ideological soldier-farmers, General George Washington actually relied on poor laborers motivated to join the army because they were offered money and land for their service. By 1778, half the men in the Continental Army weren’t even of English descent. But the pay soldiers were promised often wasn’t forthcoming, and even Continental officers went months without being paid.
Taking sides could tear families apart
Patriot Timothy Pickering Jr. was an adjutant general in Washington’s Continental Army, while his father remained a staunch Tory till the end of his life. When the younger Pickering learned of his father’s imminent death, he wrote a letter to his father to thank him for his example, even when their opinions differed. “When I look back on past time, I regret our difference of sentiment in great as well as (sometimes) in little politics; as it was a deduction from the happiness otherwise to have been enjoyed.”
Even in war, pets were important to patriots
After the 1777 Battle of Brandywine, in which the Patriots were defeated by the British, Washington found a dog sniffing around the camp. It wore tags identifying it as the property of British General William Howe and was returned to him with a note likely penned by Alexander Hamilton: “General Washington’s compliments to General Howe. He does himself the pleasure to return him a dog, which accidentally fell into his hands.”
Some patriots were pirates
Although Britain had the most powerful navy in the world in 1776, patriot forces managed to recruit privateers—armed ships commissioned by the government to attack foreign powers—to fight for the fledgling country. Nearly 800 vessels were commissioned, and they ultimately captured or destroyed approximately 600 British ships. Though an American navy could never have defeated its British counterpart, it’s estimated that the privateers caused about $18 million in damage to British shipping by the end of the war—over $302 million in today’s dollars.
Theater was a topic of controversy
When they weren’t busy fighting patriots, the British army found some unusual methods for staving off boredom—including turning to the dramatic arts. As the British army spread across New York City, Boston and Philadelphia, three men were charged with overseeing military theatrical companies: General John Burgoyne, General William Howe and General Henry Clinton. The plays staged by the army were inevitably politically charged, with soldiers portraying George Washington as a bumbling, uncouth figure while flattering the British troops. Plenty of people at the time found the soldiers’ involvement in theater unusual, or even offensive, since the performers didn’t seem to be taking the war seriously. The soldiers were aware of the criticism, as one of them, Thomas Stanley, acknowledged: “I hear a great many people blame us for acting, and think we might have found something better to do.”
Ironically, the First Continental Congress had discouraged “exhibitions of shows, plays and other expensive diversions and entertainments” in 1774, a stance likely rooted in the colonies’ existing injunctions against theatrical performances on religious or economic grounds. But not everyone agreed with the article, and in May 1778 George Washington himself approved performances by officers in the Continental Army.
George Washington had a network of spies
Washington has a reputation as a great general and exemplary first president, but he was also heralded for his work as a spymaster, known as Agent 711 in the Culper Spy Ring. The undercover patriots included farmers, tailors, merchants and other ordinary men as well as military officials. The ring was directed by Benjamin Tallmadge, alias “John Bolton,” who created a complex system of coded messages for the operatives.
The spies listened in on British conversations in locations all over the colonies, and in 1780 uncovered the British soldiers’ plan to ambush French troops. Washington also encouraged members of the ring to spread misinformation about the size of his army among British supporters. Agent 711’s work was so successful that one British officer said, “Washington did not really outfight the British. He simply out-spied us.”
One patriot survived 500 lashes at the hands of the British
Daniel Morgan was a feared guerrilla fighter during the Revolutionary War, disguising himself and his men as Native Americans to attack British units, then slipping away, throughout 1777. But Morgan’s fiery reputation was established even before the Revolutionary War. While serving the British Army as a wagoner during the French and Indian War, Morgan was struck by a British lieutenant and responded by knocking the man out. Morgan was court-martialed and received 500 lashes, enough to kill a man. He survived and liked to tell people that the British had miscounted, giving him only 499, so they still owed him one more lash.
There were women patriots, too
There may not be any women playing for the New England Patriots, but there were plenty of female patriots who assisted the Continental Army.
When Margaret Cochran married John Corbin in 1772, she could hardly have anticipated that within four years she would be joining her husband in the Revolutionary War. When John left, she followed, joining other women who cooked, did laundry and took care of the sick and wounded soldiers. In November 1776, Margaret dressed as a man to join her husband at the Battle of Fort Washington, assisting him with loading the cannon. He was killed, leaving her to take over firing the cannon. But Margaret, too, was hit, her left arm nearly severed and her jaw severely wounded. She survived the battle, which the British eventually won, and on July 6, 1779, was awarded a lifelong pension equivalent to half that received by male soldiers, becoming the first female combat veteran of the war to receive a military pension.
One of the most critical battles was fought in the South, not New England
In January 1781, South Carolina became the site of a major turning point in the Revolutionary War. Cowpens took its name from South Carolina’s pastureland and young cattle industry, and the open grazing land offered plenty of forage for horses. Some of the troops in the Continental Army were familiar with the terrain and made use of it for setting up their camps. On January 17, the Battle of Cowpens began—and was a major success for the patriots, thanks to help from spy and messenger Catherine Moore Barry. Barry knew the trails well and notified the militia of the approaching British Army, which helped General Daniel Morgan lay a trap for the British force under Banastre Tarleton.
Native Americans largely supported the British
The Revolutionary War wasn’t a battle for an unoccupied stretch of land; Native Americans had been negotiating the politics of the competing European powers for centuries by the time the colonists fought for independence from the British. But Native Americans were far from monolithic in where they stood in the war. Mohawks and other members of the Iroquois Confederacy fought for the British in the northeast, while tribes in the Ohio country tried to remain neutral. In 1778 at the Treaty of Fort Pitt, the Delawares and Americans agreed to “perpetual peace and friendship.” But when the patriots killed noncombatant Moravian Delawares, the Ohio Native Americans joined the British, and continued to fight American westward expansion long after the war.
Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
|
d9d81786d23ddd390dd4a0c7024416d9 | https://www.smithsonianmag.com/history/158-resources-understanding-systemic-racism-america-180975029/?fbclid=IwAR0cAOwrG8ciDT2O2vmpR08qFWDnwdNkSI-LTnEiyjuXg-DAYU3sIHFWxt0 | In a short essay published earlier this week, Smithsonian Secretary Lonnie G. Bunch wrote that the recent killing in Minnesota of George Floyd has forced the country to “confront the reality that, despite gains made in the past 50 years, we are still a nation riven by inequality and racial division.”
Amid escalating clashes between protesters and police, discussing race—from the inequity embedded in American institutions to the United States’ long, painful history of anti-black violence—is an essential step in sparking meaningful societal change. To support those struggling to begin these difficult conversations, the Smithsonian’s National Museum of African American History and Culture recently launched a “Talking About Race” portal featuring “tools and guidance” for educators, parents, caregivers and other people committed to equity.
“Talking About Race” joins a vast trove of resources from the Smithsonian Institution dedicated to understanding what Bunch describes as America’s “tortured racial past.” From Smithsonian magazine articles on slavery’s Trail of Tears and the disturbing resilience of scientific racism to the National Museum of American History’s collection of Black History Month resources for educators and a Sidedoor podcast on the Tulsa Race Massacre, these 158 resources are designed to foster an equal society, encourage commitment to unbiased choices and promote antiracism in all aspects of life. Listings are bolded and organized by category.
1. Historical Context
2. Systemic Inequality
3. Anti-Black Violence
4. Protest
5. Intersectionality
6. Allyship and Education
Between 1525 and 1866, 12.5 million people were kidnapped from Africa and sent to the Americas through the transatlantic slave trade. Only 10.7 million survived the harrowing two-month journey. Comprehending the sheer scale of this forced migration—and slavery’s subsequent spread across the country via interregional trade—can be a daunting task, but as historian Leslie Harris told Smithsonian’s Amy Crawford earlier this year, framing “these big concepts in terms of individual lives … can [help you] better understand what these things mean.”
Take, for instance, the story of John Casor. Originally an indentured servant of African descent, Casor lost a 1654 or 1655 court case convened to determine whether his contract had lapsed. He became the first individual declared a slave for life in what is now the United States. Manuel Vidau, a Yoruba man who was captured and sold to traders some 200 years after Casor’s enslavement, later shared an account of his life with the British and Foreign Anti-Slavery Society, which documented his remarkable story—after a decade of enslavement in Cuba, he purchased a share in a lottery ticket and won enough money to buy his freedom—in records now available on the digital database “Freedom Narratives.” (A separate, similarly document-based online resource emphasizes individuals described in fugitive slave ads, which historian Joshua Rothman describes as “sort of a little biography” providing insights on their subjects’ appearance and attire.)
Finally, consider the life of Matilda McCrear, the last known survivor of the transatlantic slave trade. Kidnapped from West Africa and brought to the U.S. on the Clotilda, she arrived in Mobile, Alabama, in July 1860—more than 50 years after Congress had outlawed the import of enslaved labor. McCrear, who died in 1940 at the age of 81 or 82, “displayed a determined, even defiant streak” in her later life, wrote Brigit Katz earlier this year. She refused to use her former owner’s last name, wore her hair in traditional Yoruba style and had a decades-long relationship with a white German man.
How American society remembers and teaches the horrors of slavery is crucial. But as recent studies have shown, many textbooks offer a sanitized view of this history, focusing solely on “positive” stories about black leaders like Harriet Tubman and Frederick Douglass. Prior to 2018, Texas schools even taught that states’ rights and sectionalism—not slavery—were the main causes of the Civil War. And, in Confederate memorials across the country, writes historian Kevin M. Levin, enslaved individuals are often falsely portrayed as loyal slaves.
Accurately representing slavery might require an updated vocabulary, argued historian Michael Landis in 2015: Outdated “[t]erms like ‘compromise’ or ‘plantation’ served either to reassure worried Americans in a Cold War world, or uphold a white supremacist, sexist interpretation of the past.” Rather than referring to the Compromise of 1850, call it the Appeasement of 1850—a term that better describes “the uneven nature of the agreement,” according to Landis. Smithsonian scholar Christopher Wilson wrote, too, that widespread framing of the Civil War as a battle between equal entities lends legitimacy to the Confederacy, which was not a nation in its own right, but an “illegitimate rebellion and unrecognized political entity.” A 2018 Smithsonian magazine investigation found that the literal costs of the Confederacy are immense: In the decade prior, American taxpayers contributed $40 million to the maintenance of Confederate monuments and heritage organizations.
To better understand the immense brutality ingrained in enslaved individuals’ everyday lives, read up on Louisiana’s Whitney Plantation Museum, which acts as “part reminder of the scars of institutional bondage, part mausoleum for dozens of enslaved people who worked (and died) in [its] sugar fields, … [and] monument to the terror of slavery,” as Jared Keller observed in 2016. Visitors begin their tour in a historic church populated by clay sculptures of children who died on the plantation’s grounds, then move on to a series of granite slabs engraved with hundreds of enslaved African Americans’ names. Scattered throughout the experience are stories of the violence inflicted by overseers.
The Whitney Plantation Museum is at the forefront of a vanguard of historical sites working to confront their racist pasts. In recent years, exhibitions, oral history projects and other initiatives have highlighted the enslaved people whose labor powered such landmarks as Mount Vernon, the White House and Monticello. At the same time, historians are increasingly calling attention to major historical figures’ own slave-holding legacies: From Thomas Jefferson to George Washington, William Clark of Lewis and Clark, Francis Scott Key, and other Founding Fathers, many American icons were complicit in upholding the institution of slavery. Washington, Jefferson, James Madison and Aaron Burr, among others, sexually abused enslaved females working in their households and had oft-overlooked biracial families.
Though Abraham Lincoln issued the Emancipation Proclamation on January 1, 1863, the decree took two-and-a-half years to fully enact. June 19, 1865—the day Union Gen. Gordon Granger informed the enslaved individuals of Galveston, Texas, that they were officially free—is now known as Juneteenth: America’s “second independence day,” according to NMAAHC. Initially celebrated mainly in Texas, Juneteenth spread across the country as African Americans fled the South in what is now called the Great Migration.
At the onset of that mass movement in 1916, 90 percent of African Americans still lived in the South, where they were “held captive by the virtual slavery of sharecropping and debt peonage and isolated from the rest of the country,” as Isabel Wilkerson wrote in 2016. (Sharecropping, a system in which formerly enslaved people became tenant farmers and lived in “converted” slave cabins, was the impetus for the 1919 Elaine Massacre, which found white soldiers collaborating with local vigilantes to kill at least 200 sharecroppers who dared to criticize their low wages.) By the time the Great Migration—famously chronicled by artist Jacob Lawrence—ended in the 1970s, 47 percent of African Americans called the northern and western United States home.
The third season of Sidedoor explored a South Carolina residence’s unique journey from slave cabin to family home and its latest incarnation as a centerpiece at the National Museum of African American History and Culture.
Conditions outside the Deep South were more favorable than those within the region, but the “hostility and hierarchies that fed the Southern caste system” remained major obstacles for black migrants in all areas of the country, according to Wilkerson. Low-paying jobs, redlining, restrictive housing covenants and rampant discrimination limited opportunities, creating inequality that would eventually give rise to the civil rights movement.
“The Great Migration was the first big step that the nation’s servant class ever took without asking,” Wilkerson explained. “ … It was about agency for a people who had been denied it, who had geography as the only tool at their disposal. It was an expression of faith, despite the terrors they had survived, that the country whose wealth had been created by their ancestors’ unpaid labor might do right by them.”
Racial, economic and educational disparities are deeply entrenched in U.S. institutions. Though the Declaration of Independence states that “all men are created equal,” American democracy has historically—and often violently—excluded certain groups. “Democracy means everybody can participate, it means you are sharing power with people you don’t know, don’t understand, might not even like,” said National Museum of American History curator Harry Rubenstein in 2017. “That’s the bargain. And some people over time have felt very threatened by that notion.”
Instances of inequality range from the obvious to less overtly discriminatory policies and belief systems. Historical examples of the former include poll taxes that effectively disenfranchised African American voters; the marginalization of African American soldiers who fought in World War I and World War II but were treated like second-class citizens at home; black innovators who were barred from filing patents for their inventions; white medical professionals’ exploitation of black women’s bodies (see Henrietta Lacks and J. Marion Sims); Richard and Mildred Loving’s decade-long fight to legalize interracial marriage; the segregated nature of travel in the Jim Crow era; the government-mandated segregation of American cities; and segregation in schools.
Among the most heartbreaking examples of structural racism’s subtle effects are accounts shared by black children. In the late 1970s, when Lebert F. Lester II was 8 or 9 years old, he started building a sand castle during a trip to the Connecticut shore. A young white girl joined him but was quickly taken away by her father. Lester recalled the girl returning, only to ask him, “Why don’t [you] just go in the water and wash it off?” “I was so confused,” Lester says. “I only figured out later she meant my complexion.” Two decades earlier, in 1957, 15-year-old Minnijean Brown had arrived at Little Rock Central High School with high hopes of “making friends, going to dances and singing in the chorus.” Instead, she and the rest of the Little Rock Nine—a group of black students selected to attend the formerly all-white academy after Brown v. Board of Education desegregated public schools—were subjected to daily verbal and physical assaults. Around the same time, photographer John G. Zimmerman captured snapshots of racial politics in the South, including images of black families waiting in long lines for polio inoculations while white children received speedy treatment.
In 1968, the Kerner Commission, a group convened by President Lyndon Johnson, found that white racism, not black anger, was the impetus for the widespread civil unrest sweeping the nation. As Alice George wrote in 2018, the commission’s report suggested that “[b]ad policing practices, a flawed justice system, unscrupulous consumer credit practices, poor or inadequate housing, high unemployment, voter suppression and other culturally embedded forms of racial discrimination all converged to propel violent upheaval.” Few listened to the findings, let alone the report’s call for aggressive government spending aimed at leveling the playing field. Instead, the country embraced a different cause: space travel. The day after the 1969 moon landing, the leading black paper the New York Amsterdam News ran a story stating, “Yesterday, the moon. Tomorrow, maybe us.”
Fifty years after the Kerner Report’s release, a separate study assessed how much had changed; it concluded that conditions had actually worsened. In 2017, black unemployment was higher than in 1968, as was the share of incarcerated individuals who were black. The wealth gap had also increased substantially, with the median white family having ten times more wealth than the median black family. “We are resegregating our cities and our schools, condemning millions of kids to inferior education and taking away their real possibility of getting out of poverty,” said Fred Harris, the last surviving member of the Kerner Commission, following the 2018 study’s release.
Today, scientific racism—grounded in such faulty practices as eugenics and the treatment of race “as a crude proxy for myriad social and environmental factors,” writes Ramin Skibba—persists despite overwhelming evidence that race has only social, not biological, meaning. Black scholars including Mamie Phipps Clark, a psychologist whose research on racial identity in children helped end segregation in schools, and Rebecca J. Cole, a 19th-century physician and advocate who challenged the idea that black communities were destined for death and disease, have helped overturn some of these biases. But a 2015 survey found that 48 percent of black and Latina women scientists still report being mistaken for custodial or administrative staff. Even artificial intelligence exhibits racial biases, many of which are introduced by lab staff and crowdsourced workers who program their own conscious and unconscious opinions into algorithms.
In addition to enduring centuries of enslavement, exploitation and inequality, African Americans have long been the targets of racially charged physical violence. Per the Alabama-based Equal Justice Initiative, more than 4,400 lynchings—mob killings undertaken without legal authority—took place in the U.S. between the end of Reconstruction and World War II.
Incredibly, the Senate only passed legislation declaring lynching a federal crime in 2018. Between 1918 and the Justice for Victims of Lynching Act’s eventual passage, more than 200 anti-lynching bills failed to make it through Congress. (Earlier this week, Sen. Rand Paul said he would hold up a separate, similarly intentioned bill over fears that its definition of lynching was too broad. The House passed the bill in a 410-to-4 vote this February.) Also in 2018, the Equal Justice Initiative opened the nation’s first monument to African American lynching victims. The six-acre memorial site stands alongside a museum dedicated to tracing the nation’s history of racial bias and persecution from slavery to the present.
One of the earliest instances of Reconstruction-era racial violence took place in Opelousas, Louisiana, in September 1868. Two months ahead of the presidential election, Southern white Democrats started terrorizing Republican opponents who appeared poised to secure victory at the polls. On September 28, a group of men attacked 18-year-old schoolteacher Emerson Bentley, who had already attracted ire for teaching African American students, after he published an account of local Democrats’ intimidation of Republicans. Bentley escaped with his life, but 27 of the 29 African Americans who arrived on the scene to help him were summarily executed. Over the next two weeks, vigilante terror led to the deaths of some 250 people, the majority of whom were black.
In April 1873, another spate of violence rocked Louisiana. The Colfax Massacre, described by historian Eric Foner as the “bloodiest single instance of racial carnage in the Reconstruction era,” unfolded under similar circumstances as Opelousas, with tensions between Democrats and Republicans culminating in the deaths of between 60 and 150 African Americans, as well as three white men.
Between the turn of the 20th century and the 1920s, multiple massacres broke out in response to false allegations that young black men had raped or otherwise assaulted white women. In August 1908, a mob terrorized African American neighborhoods across Springfield, Illinois, vandalizing black-owned businesses, setting fire to the homes of black residents, beating those unable to flee and lynching at least two people. Local authorities, argues historian Roberta Senechal, were “ineffectual at best, complicit at worst.”
False accusations also sparked a July 1919 race riot in Washington, D.C. and the Tulsa Race Massacre of 1921, which was most recently dramatized in the HBO series “Watchmen.” As African American History Museum curator Paul Gardullo tells Smithsonian, tensions related to Tulsa’s economy underpinned the violence: Forced to settle on what was thought to be worthless land, African Americans and Native Americans struck oil and proceeded to transform the Greenwood neighborhood of Tulsa into a prosperous community known as “Black Wall Street.” According to Gardullo, “It was the frustration of poor whites not knowing what to do with a successful black community, and in coalition with the city government [they] were given permission to do what they did.”
Over the course of two days in spring 1921, the Tulsa Race Massacre claimed the lives of an estimated 300 black Tulsans and displaced another 10,000. Mobs burned down at least 1,256 residences, churches, schools and businesses and destroyed almost 40 blocks of Greenwood. As the Sidedoor episode “Confronting the Past” notes, “No one knows how many people died, no one was ever convicted, and no one really talked about it nearly a century later.”
The second season of Sidedoor told the story of the Tulsa Race Massacre of 1921.
Economic injustice also led to the East St. Louis Race War of 1917. This labor dispute-turned-deadly found “people’s houses being set ablaze, … people being shot when they tried to flee, some trying to swim to the other side of the Mississippi while being shot at by white mobs with rifles, others being dragged out of street cars and beaten and hanged from street lamps,” recalled Dhati Kennedy, the son of a survivor who witnessed the devastation firsthand. Official counts place the death toll at 39 black and 9 white individuals, but locals argue that the real toll was closer to 100.
A watershed moment for the burgeoning civil rights movement was the 1955 murder of 14-year-old Emmett Till. Accused of whistling at a white woman while visiting family members in Mississippi, he was kidnapped, tortured and killed. Emmett’s mother, Mamie Till Mobley, decided to give her son an open-casket funeral, forcing the world to confront the image of his disfigured, decomposing body. (Visuals, including photographs, movies, television clips and artwork, played a key role in advancing the movement.) The two white men responsible for Till’s murder were acquitted by an all-white jury. A marker at the site where the teenager’s body was recovered has been vandalized at least three times since its placement in 2007.
The form of anti-black violence with the most striking parallels to contemporary conversations is police brutality. As Katie Nodjimbadem reported in 2017, a regional crime survey of late 1920s Chicago and Cook County, Illinois, found that while African Americans constituted just 5 percent of the area’s population, they made up 30 percent of the victims of police killings. Civil rights protests exacerbated tensions between African Americans and police, with events like the Orangeburg Massacre of 1968, in which law enforcement officers shot and killed three student activists at South Carolina State College, and the Glenville shootout, which left three police officers, three black nationalists and one civilian dead, fostering mistrust between the two groups.
Today, this legacy is exemplified by broken windows policing, a controversial approach that encourages racial profiling and targets African American and Latino communities. “What we see is a continuation of an unequal relationship that has been exacerbated, made worse if you will, by the militarization and the increase in fire power of police forces around the country,” William Pretzer, senior curator at NMAAHC, told Smithsonian in 2017.
The history of protest and revolt in the United States is inextricably linked with the racial violence detailed above.
Prior to the Civil War, enslaved individuals rarely revolted outright. Nat Turner, whose 1831 insurrection ended in his execution, was one of the rare exceptions. A fervent Christian, he drew inspiration from the Bible. His personal copy, now housed in the collections of the African American History Museum, represented the “possibility of something else for himself and for those around him,” curator Mary Ellis told Smithsonian’s Victoria Dawson in 2016.
Other enslaved African Americans practiced less risky forms of resistance, including working slowly, breaking tools and setting objects on fire. “Slave rebellions, though few and small in size in America, were invariably bloody,” wrote Dawson. “Indeed, death was all but certain.”
One of the few successful uprisings of the period was the Creole Rebellion. In the fall of 1841, 128 enslaved African Americans traveling aboard The Creole mutinied against its crew, forcing their former captors to sail the brig to the British West Indies, where slavery was abolished and they could gain immediate freedom.
An April 1712 revolt found enslaved New Yorkers setting fire to white-owned buildings and firing on slaveholders. Quickly outnumbered, the group fled but was tracked to a nearby swamp; though several members were spared, the majority were publicly executed, and in the years following the uprising, the city enacted laws limiting enslaved individuals’ already scant freedom. In 1811, meanwhile, more than 500 African Americans marched on New Orleans while chanting “Freedom or Death.” Though the German Coast uprising was brutally suppressed, historian Daniel Rasmussen argues that it “had been much larger—and come much closer to succeeding—than the planters and American officials let on.”
Some 150 years after what Rasmussen deems America’s “largest slave revolt,” the civil rights movement ushered in a different kind of protest. In 1955, police arrested Rosa Parks for refusing to yield her bus seat to a white passenger (“I had been pushed around all my life and felt at this moment that I couldn’t take it any more,” she later wrote). The ensuing Montgomery bus boycott, in which black passengers refused to ride public transit until officials met their demands, led the Supreme Court to rule segregated buses unconstitutional. Five years later, the Greensboro Four similarly took a stand, ironically by staging a sit-in at a Woolworth’s lunch counter. As Christopher Wilson wrote ahead of the 60th anniversary of the event, “What made Greensboro different [from other sit-ins] was how it grew from a courageous moment to a revolutionary movement.”
During the 1950s and ’60s, civil rights leaders adopted varying approaches to protest: Malcolm X, a staunch proponent of black nationalism who called for equality by “any means necessary,” “made tangible the anger and frustration of African Americans who were simply catching hell,” according to journalist Allison Keyes. He repeated the same argument “over and over again,” wrote academic and activist Cornel West in 2015: “What do you think you would do after 400 years of slavery and Jim Crow and lynching? Do you think you would respond nonviolently? What’s your history like? Let’s look at how you have responded when you were oppressed. George Washington—revolutionary guerrilla fighter!”
Martin Luther King Jr. famously advocated for nonviolent protest, albeit not in the form that many think. As biographer Taylor Branch told Smithsonian in 2015, King’s understanding of nonviolence was more complex than is commonly argued. Unlike Mahatma Gandhi’s “passive resistance,” King believed resistance “depended on being active, using demonstrations, direct actions, to ‘amplify the message’ of the protest they were making,” according to Ron Rosenbaum. In the activist’s own words, “[A] riot is the language of the unheard. And what is it America has failed to hear?… It has failed to hear that the promises of freedom and justice have not been met.”
Another key player in the civil rights movement, the militant Black Panther Party, celebrated black power and operated under a philosophy of “demands and aspirations.” The group’s Ten-Point Program called for an “immediate end to POLICE BRUTALITY and MURDER of Black people,” as well as more controversial measures like freeing all black prisoners and exempting black men from military service. Per NMAAHC, black power “emphasized black self-reliance and self-determination more than integration,” calling for the creation of separate African American political and cultural organizations. In doing so, the movement ensured that its proponents would attract the unwelcome attention of the FBI and other government agencies.
Many of the protests now viewed as emblematic of the fight for racial justice took place in the 1960s. On August 28, 1963, more than 250,000 people gathered in D.C. for the March on Washington for Jobs and Freedom. Ahead of the 50th anniversary of the march, activists who attended the event detailed the experience for a Smithsonian oral history: Entertainer Harry Belafonte observed, “We had to seize the opportunity and make our voices heard. Make those who are comfortable with our oppression—make them uncomfortable—Dr. King said that was the purpose of this mission,” while Representative John Lewis recalled, “Looking toward Union Station, we saw a sea of humanity; hundreds, thousands of people. … People literally pushed us, carried us all the way, until we reached the Washington Monument and then we walked on to the Lincoln Memorial.”
Two years after the March on Washington, King and other activists organized a march from Selma, Alabama, to the state capital of Montgomery. Later called the Selma March, the protest was dramatized in a 2014 film starring David Oyelowo as MLK. (Reflecting on Selma, Smithsonian Secretary Lonnie Bunch, then-director of NMAAHC, deemed it a “remarkable film” that “does not privilege the white perspective … [or] use the movement as a convenient backdrop for a conventional story.”)
Organized in response to the manifest obstacles black individuals faced when attempting to vote, the Selma March actually consisted of three separate protests. The first of these, held on March 7, 1965, ended in a tragedy now known as Bloody Sunday. As peaceful protesters gathered on the Edmund Pettus Bridge—named for a Confederate general and local Ku Klux Klan leader—law enforcement officers attacked them with tear gas and clubs. One week later, President Lyndon B. Johnson offered the Selma protesters his support and introduced legislation aimed at expanding voting rights. During the third and final march, organized in the aftermath of Johnson’s announcement, tens of thousands of protesters (protected by the National Guard and personally led by King) converged on Montgomery. Along the way, interior designer Carl Benkert used a hidden reel-to-reel tape recorder to document the sounds—and specifically songs—of the event.
The protests of the early and mid-1960s culminated in the widespread unrest of 1967 and 1968. For five days in July 1967, riots on a scale unseen since 1863 rocked the city of Detroit: As Lorraine Boissoneault writes, “Looters prowled the streets, arsonists set buildings on fire, civilian snipers took position from rooftops and police shot and arrested citizens indiscriminately.” Systemic injustice in such areas as housing, jobs and education contributed to the uprising, but police brutality was the driving factor behind the violence. By the end of the riots, 43 people were dead. Hundreds sustained injuries, and more than 7,000 were arrested.
The Detroit riots of 1967 prefaced the seismic changes of 1968. As Matthew Twombly wrote in 2018, forces including the Vietnam War, the Cold War, and the movements for civil rights, human rights and youth culture “exploded with force in 1968,” triggering aftershocks that would resonate both in America and abroad for decades to come.
On February 1, black sanitation workers Echol Cole and Robert Walker died in a gruesome accident involving a malfunctioning garbage truck. Their deaths, compounded by Mayor Henry Loeb’s refusal to negotiate with labor representatives, led to the outbreak of the Memphis sanitation workers’ strike—an event remembered both “as an example of powerless African Americans standing up for themselves” and as the backdrop to King’s April 4 assassination.
Though King is lionized today, he was highly unpopular at the time of his death. According to a Harris Poll conducted in early 1968, nearly 75 percent of Americans disapproved of the civil rights leader, who had become increasingly vocal in his criticism of the Vietnam War and economic inequity. Despite the public’s seeming ambivalence toward King—and his family’s calls for nonviolence—his murder sparked violent protests across the country. In all, the Holy Week Uprisings spread to nearly 200 cities, leaving 3,500 people injured and 43 dead. Roughly 27,000 protesters were arrested, and 54 of the cities involved sustained more than $100,000 in property damage.
In May, thousands flocked to Washington, D.C. for a protest King had planned prior to his death. Called the Poor People’s Campaign, the event united racial groups from all quarters of America in a call for economic justice. Attendees constructed “Resurrection City,” a temporary settlement made up of 3,000 wooden tents, and camped out on the National Mall for 42 days.
“While we were all in a kind of depressed state about the assassinations of King and RFK, we were trying to keep our spirits up, and keep focused on King’s ideals of humanitarian issues, the elimination of poverty and freedom,” protester Lenneal Henderson told Smithsonian in 2018. “It was exciting to be part of something that potentially, at least, could make a difference in the lives of so many people who were in poverty around the country.”
Racial unrest persisted throughout the year, with uprisings on the Fourth of July, a protest at the Summer Olympic Games, and massacres at Orangeburg and Glenville testifying to the tumultuous state of the nation.
The Black Lives Matter marches organized in response to the killings of George Floyd, Philando Castile, Freddie Gray, Eric Garner, Sandra Bland, Trayvon Martin, Michael Brown and other victims of anti-black violence share many parallels with protests of the past.
Football player Colin Kaepernick’s decision to kneel during the national anthem—and the unmitigated outrage it sparked—bears similarities to the story of boxer Muhammad Ali, historian Jonathan Eig told Smithsonian in 2017: “It’s been eerie to watch it, that we’re still having these debates that black athletes should be expected to shut their mouths and perform for us,” he said. “That’s what people told Ali 50 years ago.”
Other aspects of modern protest draw directly on uprisings of earlier eras. In 2016, for instance, artist Dread Scott updated an anti-lynching poster used by the National Association for the Advancement of Colored People (NAACP) in the 1920s and ’30s to read “A Black Man Was Lynched by Police Yesterday.” (Scott added the words “by police.”)
Though the civil rights movement is often viewed as the result of a cohesive “grand plan” or “manifestation of the vision of the few leaders whose names we know,” the American History Museum’s Christopher Wilson argues that “the truth is there wasn’t one, there were many and they were often competitive.”
Meaningful change required a whirlwind of revolution, adds Wilson, “but also the slow legal march. It took boycotts, petitions, news coverage, civil disobedience, marches, lawsuits, shrewd political maneuvering, fundraising, and even the violent terror campaign of the movement’s opponents—all going on [at] the same time.”
In layman’s terms, intersectionality refers to the multifaceted discrimination experienced by individuals who belong to multiple minority groups. As theorist Kimberlé Crenshaw explains in a video published by NMAAHC, these classifications run the gamut from race to gender, gender identity, class, sexuality and disability. A black woman who identifies as a lesbian, for instance, may face prejudice based on her race, gender or sexuality.
Crenshaw, who coined the term intersectionality in 1989, explains the concept best: “Consider an intersection made up of many roads,” she says in the video. “The roads are the structures of race, gender, gender identity, class, sexuality, disability. And the traffic running through those roads are the practices and policies that discriminate against people. Now if an accident happens, it can be caused by cars traveling in any number of directions, and sometimes, from all of them. So if a black woman is harmed because she is in an intersection, her injury could result from discrimination from any or all directions.”
Understanding intersectionality is essential for teasing out the relationships between movements including civil rights, LGBTQ rights, suffrage and feminism. Consider the contributions of black transgender activists Marsha P. Johnson and Sylvia Rivera, who played pivotal roles in the Stonewall Uprising; gay civil rights leader Bayard Rustin, who was only posthumously pardoned this year for having consensual sex with men; the “rank and file” women of the Black Panther Party; and African American suffragists such as Mary Church Terrell and Nannie Helen Burroughs.
All of these individuals fought discrimination on multiple levels: As noted in “Votes for Women: A Portrait of Persistence,” a 2019 exhibition at the National Portrait Gallery, leading suffrage organizations initially excluded black suffragists from their ranks, driving the emergence of separate suffrage movements and, eventually, black feminists grounded in the inseparable experiences of racism, sexism and classism.
Individuals striving to become better allies by educating themselves and taking decisive action have an array of options for getting started. Begin with NMAAHC’s “Talking About Race” portal, which features sections on being antiracist, whiteness, bias, social identities and systems of oppression, self-care, race and racial identity, the historical foundations of race, and community building. An additional 139 items—from a lecture on the history of racism in America to a handout on white supremacy culture and an article on the school-to-prison pipeline—are available to explore via the portal’s resources page.
In collaboration with the International Coalition of Sites of Conscience, the National Museum of the American Indian has created a toolkit that aims to “help people facilitate new conversations with and among students about the power of images and words, the challenges of memory, and the relationship between personal and national value,” says museum director Kevin Gover in a statement. The Smithsonian Asian Pacific American Center offers a similarly focused resource called “Standing Together Against Xenophobia.” As the site’s description notes, “This includes addressing not only the hatred and violence that has recently targeted people of Asian descent, but also the xenophobia that plagues our society during times of national crisis.”
Ahead of NMAAHC’s official opening in 2016, the museum hosted a series of public programs titled “History, Rebellion, and Reconciliation.” Panels included “Ferguson: What Does This Moment Mean for America?” and “#Words Matter: Making Revolution Irresistible.” As Smithsonian reported at the time, “It was somewhat of a refrain at the symposium that museums can provide ‘safe,’ or even ‘sacred’ spaces, within which visitors [can] wrestle with difficult and complex topics.” Then-director Lonnie Bunch expanded on this mindset in an interview, telling Smithsonian, “Our job is to be an educational institution that uses history and culture not only to look back, not only to help us understand today, but to point us towards what we can become.” For more context on the museum’s collections, mission and place in American history, visit Smithsonian’s “Breaking Ground” hub and NMAAHC’s digital resources guide.
Historical examples of allyship offer both inspiration and cautionary tales for the present. Take, for example, Albert Einstein, who famously criticized segregation as a “disease of white people” and continually used his platform to denounce racism. (The scientist’s advocacy is admittedly complicated by travel diaries that reveal his deeply troubling views on race.)
Einstein’s near-contemporary, a white novelist named John Howard Griffin, took his supposed allyship one step further, darkening his skin and embarking on a “human odyssey through the South,” as Bruce Watson wrote in 2011. Griffin’s chronicle of his experience, a volume titled Black Like Me, became a surprise bestseller, refuting “the idea that minorities were acting out of paranoia,” according to scholar Gerald Early, and testifying to the veracity of black people’s accounts of racism.
“The only way I could see to bridge the gap between us,” wrote Griffin in Black Like Me, “was to become a Negro.”
Griffin, however, had the privilege of being able to shed his blackness at will—which he did after just one month of wearing the disguise. By that point, Watson observed, Griffin could simply “stand no more.”
Sixty years later, what is perhaps most striking is just how little has changed. As Bunch reflected earlier this week, “The state of our democracy feels fragile and precarious.”
Addressing the racism and social inequity embedded in American society will be a “monumental task,” the secretary added. But “the past is replete with examples of ordinary people working together to overcome seemingly insurmountable challenges. History is a guide to a better future and demonstrates that we can become a better society—but only if we collectively demand it from each other and from the institutions responsible for administering justice.”
Editor’s Note, July 24, 2020: This article previously stated that some 3.9 million of the 10.7 million people who survived the harrowing two-month journey across the Middle Passage between 1525 and 1866 were ultimately enslaved in the United States. In fact, the 3.9 million figure refers to the number of enslaved individuals in the U.S. just before the Civil War. We regret the error.
| |
57ea1490715a862e3676bdfd4cf2be39 | https://www.smithsonianmag.com/history/158-resources-understanding-systemic-racism-america-180975029/?fbclid=IwAR2XUzLNCesVla8R4wIeXsQOPOeFLWS3RgzH4boAyHtaQpHk1Ux8MErxa7U | In a short essay published earlier this week, Smithsonian Secretary Lonnie G. Bunch wrote that the recent killing in Minnesota of George Floyd has forced the country to “confront the reality that, despite gains made in the past 50 years, we are still a nation riven by inequality and racial division.”
Amid escalating clashes between protesters and police, discussing race—from the inequity embedded in American institutions to the United States’ long, painful history of anti-black violence—is an essential step in sparking meaningful societal change. To support those struggling to begin these difficult conversations, the Smithsonian’s National Museum of African American History and Culture recently launched a “Talking About Race” portal featuring “tools and guidance” for educators, parents, caregivers and other people committed to equity.
“Talking About Race” joins a vast trove of resources from the Smithsonian Institution dedicated to understanding what Bunch describes as America’s “tortured racial past.” From Smithsonian magazine articles on slavery’s Trail of Tears and the disturbing resilience of scientific racism to the National Museum of American History’s collection of Black History Month resources for educators and a Sidedoor podcast on the Tulsa Race Massacre, these 158 resources are designed to foster an equal society, encourage commitment to unbiased choices and promote antiracism in all aspects of life. Listings are bolded and organized by category.
1. Historical Context
2. Systemic Inequality
3. Anti-Black Violence
4. Protest
5. Intersectionality
6. Allyship and Education
Between 1525 and 1866, 12.5 million people were kidnapped from Africa and sent to the Americas through the transatlantic slave trade. Only 10.7 million survived the harrowing two-month journey. Comprehending the sheer scale of this forced migration—and slavery’s subsequent spread across the country via interregional trade—can be a daunting task, but as historian Leslie Harris told Smithsonian’s Amy Crawford earlier this year, framing “these big concepts in terms of individual lives … can [help you] better understand what these things mean.”
Take, for instance, the story of John Casor. Originally an indentured servant of African descent, Casor lost a 1654 or 1655 court case convened to determine whether his contract had lapsed. He became the first individual declared a slave for life in the United States. Manuel Vidau, a Yoruba man who was captured and sold to traders some 200 years after Casor’s enslavement, later shared an account of his life with the British and Foreign Anti-Slavery Society, which documented his remarkable story—after a decade of enslavement in Cuba, he purchased a share in a lottery ticket and won enough money to buy his freedom—in records now available on the digital database “Freedom Narratives.” (A separate, similarly document-based online resource emphasizes individuals described in fugitive slave ads, which historian Joshua Rothman describes as “sort of a little biography” providing insights on their subjects’ appearance and attire.)
Finally, consider the life of Matilda McCrear, the last known survivor of the transatlantic slave trade. Kidnapped from West Africa and brought to the U.S. on the Clotilda, she arrived in Mobile, Alabama, in July 1860—more than 50 years after Congress had outlawed the import of enslaved labor. McCrear, who died in 1940 at the age of 81 or 82, “displayed a determined, even defiant streak” in her later life, wrote Brigit Katz earlier this year. She refused to use her former owner’s last name, wore her hair in traditional Yoruba style and had a decades-long relationship with a white German man.
How American society remembers and teaches the horrors of slavery is crucial. But as recent studies have shown, many textbooks offer a sanitized view of this history, focusing solely on “positive” stories about black leaders like Harriet Tubman and Frederick Douglass. Prior to 2018, Texas schools even taught that states’ rights and sectionalism—not slavery—were the main causes of the Civil War. And, in Confederate memorials across the country, writes historian Kevin M. Levin, enslaved individuals are often falsely portrayed as loyal slaves.
Accurately representing slavery might require an updated vocabulary, argued historian Michael Landis in 2015: Outdated “[t]erms like ‘compromise’ or ‘plantation’ served either to reassure worried Americans in a Cold War world, or uphold a white supremacist, sexist interpretation of the past.” Rather than referring to the Compromise of 1850, call it the Appeasement of 1850—a term that better describes “the uneven nature of the agreement,” according to Landis. Smithsonian scholar Christopher Wilson wrote, too, that widespread framing of the Civil War as a battle between equal entities lends legitimacy to the Confederacy, which was not a nation in its own right, but an “illegitimate rebellion and unrecognized political entity.” A 2018 Smithsonian magazine investigation found that the literal costs of the Confederacy are immense: In the decade prior, American taxpayers contributed $40 million to the maintenance of Confederate monuments and heritage organizations.
To better understand the immense brutality ingrained in enslaved individuals’ everyday lives, read up on Louisiana’s Whitney Plantation Museum, which acts as “part reminder of the scars of institutional bondage, part mausoleum for dozens of enslaved people who worked (and died) in [its] sugar fields, … [and] monument to the terror of slavery,” as Jared Keller observed in 2016. Visitors begin their tour in a historic church populated by clay sculptures of children who died on the plantation’s grounds, then move on to a series of granite slabs engraved with hundreds of enslaved African Americans’ names. Scattered throughout the experience are stories of the violence inflicted by overseers.
The Whitney Plantation Museum is in the vanguard of historical sites working to confront their racist pasts. In recent years, exhibitions, oral history projects and other initiatives have highlighted the enslaved people whose labor powered such landmarks as Mount Vernon, the White House and Monticello. At the same time, historians are increasingly calling attention to major historical figures’ own slave-holding legacies: From Thomas Jefferson to George Washington, William Clark of Lewis and Clark, Francis Scott Key, and other Founding Fathers, many American icons were complicit in upholding the institution of slavery. Washington, Jefferson, James Madison and Aaron Burr, among others, sexually abused enslaved women working in their households and had oft-overlooked biracial families.
Though Abraham Lincoln issued the Emancipation Proclamation on January 1, 1863, the decree took two and a half years to take full effect. June 19, 1865—the day Union Gen. Gordon Granger informed the enslaved individuals of Galveston, Texas, that they were officially free—is now known as Juneteenth: America’s “second independence day,” according to NMAAHC. Initially celebrated mainly in Texas, Juneteenth spread across the country as African Americans fled the South in what is now called the Great Migration.
At the onset of that mass movement in 1916, 90 percent of African Americans still lived in the South, where they were “held captive by the virtual slavery of sharecropping and debt peonage and isolated from the rest of the country,” as Isabel Wilkerson wrote in 2016. (Sharecropping, a system in which formerly enslaved people became tenant farmers and lived in “converted” slave cabins, was the impetus for the 1919 Elaine Massacre, which found white soldiers collaborating with local vigilantes to kill at least 200 sharecroppers who dared to criticize their low wages.) By the time the Great Migration—famously chronicled by artist Jacob Lawrence—ended in the 1970s, 47 percent of African Americans called the northern and western United States home.
The third season of Sidedoor explored a South Carolina residence’s unique journey from slave cabin to family home and its latest incarnation as a centerpiece at the National Museum of African American History and Culture.
Conditions outside the Deep South were more favorable than those within the region, but the “hostility and hierarchies that fed the Southern caste system” remained major obstacles for black migrants in all areas of the country, according to Wilkerson. Low-paying jobs, redlining, restrictive housing covenants and rampant discrimination limited opportunities, creating inequality that would eventually give rise to the civil rights movement.
“The Great Migration was the first big step that the nation’s servant class ever took without asking,” Wilkerson explained. “ … It was about agency for a people who had been denied it, who had geography as the only tool at their disposal. It was an expression of faith, despite the terrors they had survived, that the country whose wealth had been created by their ancestors’ unpaid labor might do right by them.”
Racial, economic and educational disparities are deeply entrenched in U.S. institutions. Though the Declaration of Independence states that “all men are created equal,” American democracy has historically—and often violently—excluded certain groups. “Democracy means everybody can participate, it means you are sharing power with people you don’t know, don’t understand, might not even like,” said National Museum of American History curator Harry Rubenstein in 2017. “That’s the bargain. And some people over time have felt very threatened by that notion.”
Instances of inequality range from overtly discriminatory policies to subtler belief systems. Historical examples of the former include poll taxes that effectively disenfranchised African American voters; the marginalization of African American soldiers who fought in World War I and World War II but were treated like second-class citizens at home; black innovators who were barred from filing patents for their inventions; white medical professionals’ exploitation of black women’s bodies (see Henrietta Lacks and J. Marion Sims); Richard and Mildred Loving’s decade-long fight to legalize interracial marriage; the segregated nature of travel in the Jim Crow era; the government-mandated segregation of American cities; and segregation in schools.
Among the most heartbreaking examples of structural racism’s subtle effects are accounts shared by black children. In the late 1970s, when Lebert F. Lester II was 8 or 9 years old, he started building a sand castle during a trip to the Connecticut shore. A young white girl joined him but was quickly taken away by her father. When the girl returned, she asked Lester, “Why don’t [you] just go in the water and wash it off?” “I was so confused,” Lester says. “I only figured out later she meant my complexion.” Two decades earlier, in 1957, 15-year-old Minnijean Brown had arrived at Little Rock Central High School with high hopes of “making friends, going to dances and singing in the chorus.” Instead, she and the rest of the Little Rock Nine—a group of black students selected to attend the formerly all-white academy after Brown v. Board of Education desegregated public schools—were subjected to daily verbal and physical assaults. Around the same time, photographer John G. Zimmerman captured snapshots of racial politics in the South, including scenes of black families waiting in long lines for polio inoculations while white children received speedy treatment.
In 1968, the Kerner Commission, a group convened by President Lyndon Johnson, found that white racism, not black anger, was the impetus for the widespread civil unrest sweeping the nation. As Alice George wrote in 2018, the commission’s report suggested that “[b]ad policing practices, a flawed justice system, unscrupulous consumer credit practices, poor or inadequate housing, high unemployment, voter suppression and other culturally embedded forms of racial discrimination all converged to propel violent upheaval.” Few listened to the findings, let alone the report’s suggestion of aggressive government spending aimed at leveling the playing field. Instead, the country embraced a different cause: space travel. The day after the 1969 moon landing, the leading black paper the New York Amsterdam News ran a story stating, “Yesterday, the moon. Tomorrow, maybe us.”
Fifty years after the Kerner Report’s release, a separate study assessed how much had changed; it concluded that conditions had actually worsened. In 2017, black unemployment was higher than in 1968, as was the incarceration rate among black Americans. The wealth gap had also increased substantially, with the median white family having ten times more wealth than the median black family. “We are resegregating our cities and our schools, condemning millions of kids to inferior education and taking away their real possibility of getting out of poverty,” said Fred Harris, the last surviving member of the Kerner Commission, following the 2018 study’s release.
Today, scientific racism—grounded in such faulty practices as eugenics and the treatment of race “as a crude proxy for myriad social and environmental factors,” writes Ramin Skibba—persists despite overwhelming evidence that race has only social, not biological, meaning. Black scholars including Mamie Phipps Clark, a psychologist whose research on racial identity in children helped end segregation in schools, and Rebecca J. Cole, a 19th-century physician and advocate who challenged the idea that black communities were destined for death and disease, have helped overturn some of these biases. But a 2015 survey found that 48 percent of black and Latina women scientists still report being mistaken for custodial or administrative staff. Even artificial intelligence exhibits racial biases, many of which are introduced by lab staff and crowdsourced workers who program their own conscious and unconscious opinions into algorithms.
In addition to enduring centuries of enslavement, exploitation and inequality, African Americans have long been the targets of racially charged physical violence. Per the Alabama-based Equal Justice Initiative, more than 4,400 lynchings—mob killings undertaken without legal authority—took place in the U.S. between the end of Reconstruction and World War II.
Incredibly, the Senate did not pass legislation declaring lynching a federal crime until 2018. Between 1918 and the Justice for Victims of Lynching Act’s eventual passage, more than 200 anti-lynching bills failed to make it through Congress. (Earlier this week, Sen. Rand Paul said he would hold up a separate, similarly intentioned bill over fears that its definition of lynching was too broad. The House passed the bill in a 410-to-4 vote this February.) Also in 2018, the Equal Justice Initiative opened the nation’s first monument to African American lynching victims. The six-acre memorial site stands alongside a museum dedicated to tracing the nation’s history of racial bias and persecution from slavery to the present.
One of the earliest instances of Reconstruction-era racial violence took place in Opelousas, Louisiana, in September 1868. Two months ahead of the presidential election, Southern white Democrats started terrorizing Republican opponents who appeared poised to secure victory at the polls. On September 28, a group of men attacked 18-year-old schoolteacher Emerson Bentley, who had already attracted ire for teaching African American students, after he published an account of local Democrats’ intimidation of Republicans. Bentley escaped with his life, but 27 of the 29 African Americans who arrived on the scene to help him were summarily executed. Over the next two weeks, vigilante terror led to the deaths of some 250 people, the majority of whom were black.
In April 1873, another spate of violence rocked Louisiana. The Colfax Massacre, described by historian Eric Foner as the “bloodiest single instance of racial carnage in the Reconstruction era,” unfolded under similar circumstances as Opelousas, with tensions between Democrats and Republicans culminating in the deaths of between 60 and 150 African Americans, as well as three white men.
Between the turn of the 20th century and the 1920s, multiple massacres broke out in response to false allegations that young black men had raped or otherwise assaulted white women. In August 1908, a mob terrorized African American neighborhoods across Springfield, Illinois, vandalizing black-owned businesses, setting fire to the homes of black residents, beating those unable to flee and lynching at least two people. Local authorities, argues historian Roberta Senechal, were “ineffectual at best, complicit at worst.”
False accusations also sparked a July 1919 race riot in Washington, D.C., and the Tulsa Race Massacre of 1921, which was most recently dramatized in the HBO series “Watchmen.” As African American History Museum curator Paul Gardullo tells Smithsonian, tensions related to Tulsa’s economy underpinned the violence: Forced to settle on what was thought to be worthless land, African Americans and Native Americans struck oil and proceeded to transform the Greenwood neighborhood of Tulsa into a prosperous community known as “Black Wall Street.” According to Gardullo, “It was the frustration of poor whites not knowing what to do with a successful black community, and in coalition with the city government [they] were given permission to do what they did.”
Over the course of two days in spring 1921, the Tulsa Race Massacre claimed the lives of an estimated 300 black Tulsans and displaced another 10,000. Mobs burned down at least 1,256 residences, churches, schools and businesses and destroyed almost 40 blocks of Greenwood. As the Sidedoor episode “Confronting the Past” notes, “No one knows how many people died, no one was ever convicted, and no one really talked about it nearly a century later.”
The second season of Sidedoor told the story of the Tulsa Race Massacre of 1921.
Economic injustice also led to the East St. Louis Race War of 1917. This labor dispute-turned-massacre found “people’s houses being set ablaze, … people being shot when they tried to flee, some trying to swim to the other side of the Mississippi while being shot at by white mobs with rifles, others being dragged out of street cars and beaten and hanged from street lamps,” recalled Dhati Kennedy, the son of a survivor who witnessed the devastation firsthand. Official counts place the death toll at 39 black and nine white individuals, but locals argue that the real toll was closer to 100.
A watershed moment for the burgeoning civil rights movement was the 1955 murder of 14-year-old Emmett Till. Accused of whistling at a white woman while visiting family members in Mississippi, he was kidnapped, tortured and killed. Emmett’s mother, Mamie Till Mobley, decided to give her son an open-casket funeral, forcing the world to confront the image of his disfigured, decomposing body. (Visuals, including photographs, movies, television clips and artwork, played a key role in advancing the movement.) The two white men responsible for Till’s murder were acquitted by an all-white jury. A marker at the site where the teenager’s body was recovered has been vandalized at least three times since its placement in 2007.
The form of anti-black violence with the most striking parallels to contemporary conversations is police brutality. As Katie Nodjimbadem reported in 2017, a regional crime survey of late 1920s Chicago and Cook County, Illinois, found that while African Americans constituted just 5 percent of the area’s population, they made up 30 percent of the victims of police killings. Civil rights protests exacerbated tensions between African Americans and police, with events like the Orangeburg Massacre of 1968, in which law enforcement officers shot and killed three student activists at South Carolina State College, and the Glenville shootout, which left three police officers, three black nationalists and one civilian dead, fostering mistrust between the two groups.
Today, this legacy is exemplified by broken windows policing, a controversial approach that encourages racial profiling and targets African American and Latino communities. “What we see is a continuation of an unequal relationship that has been exacerbated, made worse if you will, by the militarization and the increase in fire power of police forces around the country,” William Pretzer, senior curator at NMAAHC, told Smithsonian in 2017.
The history of protest and revolt in the United States is inextricably linked with the racial violence detailed above.
Prior to the Civil War, enslaved individuals rarely revolted outright. Nat Turner, whose 1831 insurrection ended in his execution, was one of the rare exceptions. A fervent Christian, he drew inspiration from the Bible. His personal copy, now housed in the collections of the African American History Museum, represented the “possibility of something else for himself and for those around him,” curator Mary Ellis told Smithsonian’s Victoria Dawson in 2016.
Other enslaved African Americans practiced less risky forms of resistance, including working slowly, breaking tools and setting objects on fire. “Slave rebellions, though few and small in size in America, were invariably bloody,” wrote Dawson. “Indeed, death was all but certain.”
One of the few successful uprisings of the period was the Creole Rebellion. In the fall of 1841, 128 enslaved African Americans traveling aboard The Creole mutinied against its crew, forcing their former captors to sail the brig to the British West Indies, where slavery was abolished and they could gain immediate freedom.
An April 1712 revolt found enslaved New Yorkers setting fire to white-owned buildings and firing on slaveholders. Quickly outnumbered, the group fled but was tracked to a nearby swamp; though several members were spared, the majority were publicly executed, and in the years following the uprising, the city enacted laws limiting enslaved individuals’ already scant freedom. In 1811, meanwhile, more than 500 African Americans marched on New Orleans while chanting “Freedom or Death.” Though the German Coast uprising was brutally suppressed, historian Daniel Rasmussen argues that it “had been much larger—and come much closer to succeeding—than the planters and American officials let on.”
Some 150 years after what Rasmussen deems America’s “largest slave revolt,” the civil rights movement ushered in a different kind of protest. In 1955, police arrested Rosa Parks for refusing to yield her bus seat to a white passenger (“I had been pushed around all my life and felt at this moment that I couldn’t take it any more,” she later wrote). The ensuing Montgomery bus boycott, in which black passengers refused to ride public transit until officials met their demands, led the Supreme Court to rule segregated buses unconstitutional. Five years later, the Greensboro Four similarly took a stand, ironically by staging a sit-in at a Woolworth’s lunch counter. As Christopher Wilson wrote ahead of the 60th anniversary of the event, “What made Greensboro different [from other sit-ins] was how it grew from a courageous moment to a revolutionary movement.”
During the 1950s and ’60s, civil rights leaders adopted varying approaches to protest: Malcolm X, a staunch proponent of black nationalism who called for equality by “any means necessary,” “made tangible the anger and frustration of African Americans who were simply catching hell,” according to journalist Allison Keyes. He repeated the same argument “over and over again,” wrote academic and activist Cornel West in 2015: “What do you think you would do after 400 years of slavery and Jim Crow and lynching? Do you think you would respond nonviolently? What’s your history like? Let’s look at how you have responded when you were oppressed. George Washington—revolutionary guerrilla fighter!’”
Martin Luther King Jr. famously advocated for nonviolent protest, albeit not in the form that many think. As biographer Taylor Branch told Smithsonian in 2015, King’s understanding of nonviolence was more complex than is commonly argued. Unlike Mahatma Gandhi’s “passive resistance,” King believed resistance “depended on being active, using demonstrations, direct actions, to ‘amplify the message’ of the protest they were making,” according to Ron Rosenbaum. In the activist’s own words, “[A] riot is the language of the unheard. And what is it America has failed to hear?… It has failed to hear that the promises of freedom and justice have not been met. ”
Another key player in the civil rights movement, the militant Black Panther Party, celebrated black power and operated under a philosophy of “demands and aspirations.” The group’s Ten-Point Program called for an “immediate end to POLICE BRUTALITY and MURDER of Black people,” as well as more controversial measures like freeing all black prisoners and exempting black men from military service. Per NMAAHC, black power “emphasized black self-reliance and self-determination more than integration,” calling for the creation of separate African American political and cultural organizations. In doing so, the movement ensured that its proponents would attract the unwelcome attention of the FBI and other government agencies.
Many of the protests now viewed as emblematic of the fight for racial justice took place in the 1960s. On August 28, 1963, more than 250,000 people gathered in D.C. for the March on Washington for Jobs and Freedom. Ahead of the 50th anniversary of the march, activists who attended the event detailed the experience for a Smithsonian oral history: Entertainer Harry Belafonte observed, “We had to seize the opportunity and make our voices heard. Make those who are comfortable with our oppression—make them uncomfortable—Dr. King said that was the purpose of this mission,” while Representative John Lewis recalled, “Looking toward Union Station, we saw a sea of humanity; hundreds, thousands of people. … People literally pushed us, carried us all the way, until we reached the Washington Monument and then we walked on to the Lincoln Memorial..”
Two years after the March on Washington, King and other activists organized a march from Selma, Alabama, to the state capital of Montgomery. Later called the Selma March, the protest was dramatized in a 2014 film starring David Oyelowo as MLK. (Reflecting on Selma, Smithsonian Secretary Lonnie Bunch, then-director of NMAAHC, deemed it a “remarkable film” that “does not privilege the white perspective … [or] use the movement as a convenient backdrop for a conventional story.”)
Organized in response to the manifest obstacles black individuals faced when attempting to vote, the Selma March actually consisted of three separate protests. The first of these, held on March 7, 1965, ended in a tragedy now known as Bloody Sunday. As peaceful protesters gathered on the Edmund Pettus Bridge—named for a Confederate general and local Ku Klux Klan leader—law enforcement officers attacked them with tear gas and clubs. One week later, President Lyndon B. Johnson offered the Selma protesters his support and introduced legislation aimed at expanding voting rights. During the third and final march, organized in the aftermath of Johnson’s announcement, tens of thousands of protesters (protected by the National Guard and personally led by King) converged on Montgomery. Along the way, interior designer Carl Benkert used a hidden reel-to-reel tape recorder to document the sounds—and specifically songs—of the event.
The protests of the early and mid-1960s culminated in the widespread unrest of 1967 and 1968. For five days in July 1967, riots on a scale unseen since 1863 rocked the city of Detroit: As Lorraine Boissoneault writes, “Looters prowled the streets, arsonists set buildings on fire, civilian snipers took position from rooftops and police shot and arrested citizens indiscriminately.” Systemic injustice in such areas as housing, jobs and education contributed to the uprising, but police brutality was the driving factor behind the violence. By the end of the riots, 43 people were dead. Hundreds sustained injuries, and more than 7,000 were arrested.
The Detroit riots of 1967 prefaced the seismic changes of 1968. As Matthew Twombly wrote in 2018, movements including the Vietnam War, the Cold War, civil rights, human rights and youth culture “exploded with force in 1968,” triggering aftershocks that would resonate both in America and abroad for decades to come.
On February 1, black sanitation workers Echol Cole and Robert Walker died in a gruesome accident involving a malfunctioning garbage truck. Their deaths, compounded by Mayor Henry Loeb’s refusal to negotiate with labor representatives, led to the outbreak of the Memphis sanitation workers’ strike—an event remembered both “as an example of powerless African Americans standing up for themselves” and as the backdrop to King’s April 4 assassination.
Though King is lionized today, he was highly unpopular at the time of his death. According to a Harris Poll conducted in early 1968, nearly 75 percent of Americans disapproved of the civil rights leader, who had become increasingly vocal in his criticism of the Vietnam War and economic inequity. Despite the public’s seeming ambivalence toward King—and his family’s calls for nonviolence—his murder sparked violent protests across the country. In all, the Holy Week Uprisings spread to nearly 200 cities, leaving 3,500 people injured and 43 dead. Roughly 27,000 protesters were arrested, and 54 of the cities involved sustained more than $100,000 in property damage.
In May, thousands flocked to Washington, D.C. for a protest King had planned prior to his death. Called the Poor People’s Campaign, the event united racial groups from all quarters of America in a call for economic justice. Attendees constructed “Resurrection City,” a temporary settlement made up of 3,000 wooden tents, and camped out on the National Mall for 42 days.
“While we were all in a kind of depressed state about the assassinations of King and RFK, we were trying to keep our spirits up, and keep focused on King’s ideals of humanitarian issues, the elimination of poverty and freedom,” protester Lenneal Henderson told Smithsonian in 2018. “It was exciting to be part of something that potentially, at least, could make a difference in the lives of so many people who were in poverty around the country.”
Racial unrest persisted throughout the year, with uprisings on the Fourth of July, a protest at the Summer Olympic Games, and massacres at Orangeburg and Glenville testifying to the tumultuous state of the nation.
The Black Lives Matter marches organized in response to the killings of George Floyd, Philando Castile, Freddie Gray, Eric Garner, Sandra Bland, Trayvon Martin, Michael Brown and other victims of anti-black violence share many parallels with protests of the past.
Football player Colin Kaepernick’s decision to kneel during the national anthem—and the unmitigated outrage it sparked—bears similarities to the story of boxer Muhammad Ali, historian Jonathan Eig told Smithsonian in 2017: “It’s been eerie to watch it, that we’re still having these debates that black athletes should be expected to shut their mouths and perform for us,” he said. “That’s what people told Ali 50 years ago.”
Other aspects of modern protest draw directly on uprisings of earlier eras. In 2016, for instance, artist Dread Scott updated an anti-lynching poster used by the National Association for the Advancement of Colored People (NAACP) in the 1920s and ’30s to read “A Black Man Was Lynched by Police Yesterday.” (Scott added the words “by police.”)
Though the civil rights movement is often viewed as the result of a cohesive “grand plan” or “manifestation of the vision of the few leaders whose names we know,” the American History Museum’s Christopher Wilson argues that “the truth is there wasn’t one, there were many and they were often competitive.”
Meaningful change required a whirlwind of revolution, adds Wilson, “but also the slow legal march. It took boycotts, petitions, news coverage, civil disobedience, marches, lawsuits, shrewd political maneuvering, fundraising, and even the violent terror campaign of the movement’s opponents—all going on [at] the same time.”
In layman’s terms, intersectionality refers to the multifaceted discrimination experienced by individuals who belong to multiple minority groups. As theorist Kimberlé Crenshaw explains in a video published by NMAAHC, these classifications run the gamut from race to gender, gender identity, class, sexuality and disability. A black woman who identifies as a lesbian, for instance, may face prejudice based on her race, gender or sexuality.
Crenshaw, who coined the term intersectionality in 1989, explains the concept best: “Consider an intersection made up of many roads,” she says in the video. “The roads are the structures of race, gender, gender identity, class, sexuality, disability. And the traffic running through those roads are the practices and policies that discriminate against people. Now if an accident happens, it can be caused by cars traveling in any number of directions, and sometimes, from all of them. So if a black woman is harmed because she is in an intersection, her injury could result from discrimination from any or all directions.”
Understanding intersectionality is essential for teasing out the relationships between movements including civil rights, LGBTQ rights, suffrage and feminism. Consider the contributions of black transgender activists Marsha P. Johnson and Sylvia Rivera, who played pivotal roles in the Stonewall Uprising; gay civil rights leader Bayard Rustin, who was only posthumously pardoned this year for having consensual sex with men; the “rank and file” women of the Black Panther Party; and African American suffragists such as Mary Church Terrell and Nannie Helen Burroughs.
All of these individuals fought discrimination on multiple levels: As noted in “Votes for Women: A Portrait of Persistence,” a 2019 exhibition at the National Portrait Gallery, leading suffrage organizations initially excluded black suffragists from their ranks, driving the emergence of separate suffrage movements and, eventually, black feminists grounded in the inseparable experiences of racism, sexism and classism.
Individuals striving to become better allies by educating themselves and taking decisive action have an array of options for getting started. Begin with NMAAHC’s “Talking About Race” portal, which features sections on being antiracist, whiteness, bias, social identities and systems of oppression, self-care, race and racial identity, the historical foundations of race, and community building. An additional 139 items—from a lecture on the history of racism in America to a handout on white supremacy culture and an article on the school-to-prison pipeline—are available to explore via the portal’s resources page.
In collaboration with the International Coalition of Sites of Conscience, the National Museum of the American Indian has created a toolkit that aims to “help people facilitate new conversations with and among students about the power of images and words, the challenges of memory, and the relationship between personal and national value,” says museum director Kevin Gover in a statement. The Smithsonian Asian Pacific American Center offers a similarly focused resource called “Standing Together Against Xenophobia.” As the site’s description notes, “This includes addressing not only the hatred and violence that has recently targeted people of Asian descent, but also the xenophobia that plagues our society during times of national crisis.”
Ahead of NMAAHC’s official opening in 2016, the museum hosted a series of public programs titled “History, Rebellion, and Reconciliation.” Panels included “Ferguson: What Does This Moment Mean for America?” and “#Words Matter: Making Revolution Irresistible.” As Smithsonian reported at the time, “It was somewhat of a refrain at the symposium that museums can provide ‘safe,’ or even ‘sacred’ spaces, within which visitors [can] wrestle with difficult and complex topics.” Then-director Lonnie Bunch expanded on this mindset in an interview, telling Smithsonian, “Our job is to be an educational institution that uses history and culture not only to look back, not only to help us understand today, but to point us towards what we can become.” For more context on the museum’s collections, mission and place in American history, visit Smithsonian’s “Breaking Ground” hub and NMAAHC’s digital resources guide.
Historical examples of allyship offer both inspiration and cautionary tales for the present. Take, for example, Albert Einstein, who famously criticized segregation as a “disease of white people” and continually used his platform to denounce racism. (The scientist’s advocacy is admittedly complicated by travel diaries that reveal his deeply troubling views on race.)
Einstein’s near-contemporary, a white novelist named John Howard Griffin, took his supposed allyship one step further, darkening his skin and embarking on a “human odyssey through the South,” as Bruce Watson wrote in 2011. Griffin’s chronicle of his experience, a volume titled Black Like Me, became a surprise bestseller, refuting “the idea that minorities were acting out of paranoia,” according to scholar Gerald Early, and testifying to the veracity of black people’s accounts of racism.
“The only way I could see to bridge the gap between us,” wrote Griffin in Black Like Me, “was to become a Negro.”
Griffin, however, had the privilege of being able to shed his blackness at will—which he did after just one month of donning his makeup. By that point, Watson observed, Griffin could simply “stand no more.”
Sixty years later, what is perhaps most striking is just how little has changed. As Bunch reflected earlier this week, “The state of our democracy feels fragile and precarious.”
Addressing the racism and social inequity embedded in American society will be a “monumental task,” the secretary added. But “the past is replete with examples of ordinary people working together to overcome seemingly insurmountable challenges. History is a guide to a better future and demonstrates that we can become a better society—but only if we collectively demand it from each other and from the institutions responsible for administering justice.”
Editor’s Note, July 24, 2020: This article previously stated that some 3.9 million of the 10.7 million people who survived the harrowing two-month journey across the Middle Passage between 1525 and 1866 were ultimately enslaved in the United States. In fact, the 3.9 million figure refers to the number of enslaved individuals in the U.S. just before the Civil War. We regret the error.
| |
d32cb3361574693bdfe2f2a9cd7436b5 | https://www.smithsonianmag.com/history/158-resources-understanding-systemic-racism-america-180975029/?fbclid=IwAR3VpOTAydTwRE639Ouqq4PQ6qkhRF2qQegIEvG7p-_867KxRGesY67XGks | In a short essay published earlier this week, Smithsonian Secretary Lonnie G. Bunch wrote that the recent killing in Minnesota of George Floyd has forced the country to “confront the reality that, despite gains made in the past 50 years, we are still a nation riven by inequality and racial division.”
Amid escalating clashes between protesters and police, discussing race—from the inequity embedded in American institutions to the United States’ long, painful history of anti-black violence—is an essential step in sparking meaningful societal change. To support those struggling to begin these difficult conversations, the Smithsonian’s National Museum of African American History and Culture recently launched a “Talking About Race” portal featuring “tools and guidance” for educators, parents, caregivers and other people committed to equity.
“Talking About Race” joins a vast trove of resources from the Smithsonian Institution dedicated to understanding what Bunch describes as America’s “tortured racial past.” From Smithsonian magazine articles on slavery’s Trail of Tears and the disturbing resilience of scientific racism to the National Museum of American History’s collection of Black History Month resources for educators and a Sidedoor podcast on the Tulsa Race Massacre, these 158 resources are designed to foster an equal society, encourage commitment to unbiased choices and promote antiracism in all aspects of life. Listings are bolded and organized by category.
1. Historical Context
2. Systemic Inequality
3. Anti-Black Violence
4. Protest
5. Intersectionality
6. Allyship and Education
Between 1525 and 1866, 12.5 million people were kidnapped from Africa and sent to the Americas through the transatlantic slave trade. Only 10.7 million survived the harrowing two-month journey. Comprehending the sheer scale of this forced migration—and slavery’s subsequent spread across the country via interregional trade—can be a daunting task, but as historian Leslie Harris told Smithsonian’s Amy Crawford earlier this year, framing “these big concepts in terms of individual lives … can [help you] better understand what these things mean.”
Take, for instance, the story of John Casor. Originally an indentured servant of African descent, Casor lost a 1654 or 1655 court case convened to determine whether his contract had lapsed. He became the first individual declared a slave for life in what would become the United States. Manuel Vidau, a Yoruba man who was captured and sold to traders some 200 years after Casor’s enslavement, later shared an account of his life with the British and Foreign Anti-Slavery Society, which documented his remarkable story—after a decade of enslavement in Cuba, he purchased a share in a lottery ticket and won enough money to buy his freedom—in records now available on the digital database “Freedom Narratives.” (A separate, similarly document-based online resource emphasizes individuals described in fugitive slave ads, which historian Joshua Rothman describes as “sort of a little biography” providing insights on their subjects’ appearance and attire.)
Finally, consider the life of Matilda McCrear, the last known survivor of the transatlantic slave trade. Kidnapped from West Africa and brought to the U.S. on the Clotilda, she arrived in Mobile, Alabama, in July 1860—more than 50 years after Congress had outlawed the import of enslaved labor. McCrear, who died in 1940 at the age of 81 or 82, “displayed a determined, even defiant streak” in her later life, wrote Brigit Katz earlier this year. She refused to use her former owner’s last name, wore her hair in traditional Yoruba style and had a decades-long relationship with a white German man.
How American society remembers and teaches the horrors of slavery is crucial. But as recent studies have shown, many textbooks offer a sanitized view of this history, focusing solely on “positive” stories about black leaders like Harriet Tubman and Frederick Douglass. Prior to 2018, Texas schools even taught that states’ rights and sectionalism—not slavery—were the main causes of the Civil War. And, in Confederate memorials across the country, writes historian Kevin M. Levin, enslaved individuals are often falsely portrayed as loyal slaves.
Accurately representing slavery might require an updated vocabulary, argued historian Michael Landis in 2015: Outdated “[t]erms like ‘compromise’ or ‘plantation’ served either to reassure worried Americans in a Cold War world, or uphold a white supremacist, sexist interpretation of the past.” Rather than referring to the Compromise of 1850, call it the Appeasement of 1850—a term that better describes “the uneven nature of the agreement,” according to Landis. Smithsonian scholar Christopher Wilson wrote, too, that widespread framing of the Civil War as a battle between equal entities lends legitimacy to the Confederacy, which was not a nation in its own right, but an “illegitimate rebellion and unrecognized political entity.” A 2018 Smithsonian magazine investigation found that the literal costs of the Confederacy are immense: In the decade prior, American taxpayers contributed $40 million to the maintenance of Confederate monuments and heritage organizations.
To better understand the immense brutality ingrained in enslaved individuals’ everyday lives, read up on Louisiana’s Whitney Plantation Museum, which acts as “part reminder of the scars of institutional bondage, part mausoleum for dozens of enslaved people who worked (and died) in [its] sugar fields, … [and] monument to the terror of slavery,” as Jared Keller observed in 2016. Visitors begin their tour in a historic church populated by clay sculptures of children who died on the plantation’s grounds, then move on to a series of granite slabs engraved with hundreds of enslaved African Americans’ names. Scattered throughout the experience are stories of the violence inflicted by overseers.
The Whitney Plantation Museum is in the vanguard of historical sites working to confront their racist pasts. In recent years, exhibitions, oral history projects and other initiatives have highlighted the enslaved people whose labor powered such landmarks as Mount Vernon, the White House and Monticello. At the same time, historians are increasingly calling attention to major historical figures’ own slave-holding legacies: From Thomas Jefferson to George Washington, William Clark of Lewis and Clark, Francis Scott Key, and other Founding Fathers, many American icons were complicit in upholding the institution of slavery. Washington, Jefferson, James Madison and Aaron Burr, among others, sexually abused enslaved women working in their households and had oft-overlooked biracial families.
Though Abraham Lincoln issued the Emancipation Proclamation on January 1, 1863, the decree took two-and-a-half years to fully enact. June 19, 1865—the day Union Gen. Gordon Granger informed the enslaved individuals of Galveston, Texas, that they were officially free—is now known as Juneteenth: America’s “second independence day,” according to NMAAHC. Initially celebrated mainly in Texas, Juneteenth spread across the country as African Americans fled the South in what is now called the Great Migration.
At the onset of that mass movement in 1916, 90 percent of African Americans still lived in the South, where they were “held captive by the virtual slavery of sharecropping and debt peonage and isolated from the rest of the country,” as Isabel Wilkerson wrote in 2016. (Sharecropping, a system in which formerly enslaved people became tenant farmers and lived in “converted” slave cabins, was the impetus for the 1919 Elaine Massacre, which found white soldiers collaborating with local vigilantes to kill at least 200 sharecroppers who dared to criticize their low wages.) By the time the Great Migration—famously chronicled by artist Jacob Lawrence—ended in the 1970s, 47 percent of African Americans called the northern and western United States home.
The third season of Sidedoor explored a South Carolina residence’s unique journey from slave cabin to family home and its latest incarnation as a centerpiece at the National Museum of African American History and Culture.
Conditions outside the Deep South were more favorable than those within the region, but the “hostility and hierarchies that fed the Southern caste system” remained major obstacles for black migrants in all areas of the country, according to Wilkerson. Low-paying jobs, redlining, restrictive housing covenants and rampant discrimination limited opportunities, creating inequality that would eventually give rise to the civil rights movement.
“The Great Migration was the first big step that the nation’s servant class ever took without asking,” Wilkerson explained. “ … It was about agency for a people who had been denied it, who had geography as the only tool at their disposal. It was an expression of faith, despite the terrors they had survived, that the country whose wealth had been created by their ancestors’ unpaid labor might do right by them.”
Racial, economic and educational disparities are deeply entrenched in U.S. institutions. Though the Declaration of Independence states that “all men are created equal,” American democracy has historically—and often violently—excluded certain groups. “Democracy means everybody can participate, it means you are sharing power with people you don’t know, don’t understand, might not even like,” said National Museum of American History curator Harry Rubenstein in 2017. “That’s the bargain. And some people over time have felt very threatened by that notion.”
Instances of inequality range from overt discrimination to subtler policies and belief systems. Historical examples of the former include poll taxes that effectively disenfranchised African American voters; the marginalization of African American soldiers who fought in World War I and World War II but were treated like second-class citizens at home; black innovators who were barred from filing patents for their inventions; white medical professionals’ exploitation of black women’s bodies (see Henrietta Lacks and J. Marion Sims); Richard and Mildred Loving’s decade-long fight to legalize interracial marriage; the segregated nature of travel in the Jim Crow era; the government-mandated segregation of American cities; and segregation in schools.
Among the most heartbreaking examples of structural racism’s subtle effects are accounts shared by black children. In the late 1970s, when Lebert F. Lester II was 8 or 9 years old, he started building a sand castle during a trip to the Connecticut shore. A young white girl joined him but was quickly taken away by her father. Lester recalled the girl returning, only to ask him, “Why don’t [you] just go in the water and wash it off?” “I was so confused—I only figured out later she meant my complexion,” Lester says. Two decades earlier, in 1957, 15-year-old Minnijean Brown had arrived at Little Rock Central High School with high hopes of “making friends, going to dances and singing in the chorus.” Instead, she and the rest of the Little Rock Nine—a group of black students selected to attend the formerly all-white academy after Brown v. Board of Education desegregated public schools—were subjected to daily verbal and physical assaults. Around the same time, photographer John G. Zimmerman captured snapshots of racial politics in the South that included comparisons of black families waiting in long lines for polio inoculations as white children received speedy treatment.
In 1968, the Kerner Commission, a group convened by President Lyndon Johnson, found that white racism, not black anger, was the impetus for the widespread civil unrest sweeping the nation. As Alice George wrote in 2018, the commission’s report suggested that “[b]ad policing practices, a flawed justice system, unscrupulous consumer credit practices, poor or inadequate housing, high unemployment, voter suppression and other culturally embedded forms of racial discrimination all converged to propel violent upheaval.” Few listened to the findings, let alone its suggestion of aggressive government spending aimed at leveling the playing field. Instead, the country embraced a different cause: space travel. The day after the 1969 moon landing, the leading black paper the New York Amsterdam News ran a story stating, “Yesterday, the moon. Tomorrow, maybe us.”
Fifty years after the Kerner Report’s release, a separate study assessed how much had changed; it concluded that conditions had actually worsened. In 2017, black unemployment was higher than in 1968, as was the proportion of incarcerated individuals who were black. The wealth gap had also increased substantially, with the median white family having ten times more wealth than the median black family. “We are resegregating our cities and our schools, condemning millions of kids to inferior education and taking away their real possibility of getting out of poverty,” said Fred Harris, the last surviving member of the Kerner Commission, following the 2018 study’s release.
Today, scientific racism—grounded in such faulty practices as eugenics and the treatment of race “as a crude proxy for myriad social and environmental factors,” writes Ramin Skibba—persists despite overwhelming evidence that race has only social, not biological, meaning. Black scholars including Mamie Phipps Clark, a psychologist whose research on racial identity in children helped end segregation in schools, and Rebecca J. Cole, a 19th-century physician and advocate who challenged the idea that black communities were destined for death and disease, have helped overturn some of these biases. But a 2015 survey found that 48 percent of black and Latina women scientists still report being mistaken for custodial or administrative staff. Even artificial intelligence exhibits racial biases, many of which are introduced by lab staff and crowdsourced workers who program their own conscious and unconscious opinions into algorithms.
In addition to enduring centuries of enslavement, exploitation and inequality, African Americans have long been the targets of racially charged physical violence. Per the Alabama-based Equal Justice Initiative, more than 4,400 lynchings—mob killings undertaken without legal authority—took place in the U.S. between the end of Reconstruction and World War II.
Incredibly, the Senate only passed legislation declaring lynching a federal crime in 2018. Between 1918 and the Justice for Victims of Lynching Act’s eventual passage, more than 200 anti-lynching bills failed to make it through Congress. (Earlier this week, Sen. Rand Paul said he would hold up a separate, similarly intentioned bill over fears that its definition of lynching was too broad. The House passed the bill in a 410-to-4 vote this February.) Also in 2018, the Equal Justice Initiative opened the nation’s first monument to African American lynching victims. The six-acre memorial site stands alongside a museum dedicated to tracing the nation’s history of racial bias and persecution from slavery to the present.
One of the earliest instances of Reconstruction-era racial violence took place in Opelousas, Louisiana, in September 1868. Two months ahead of the presidential election, Southern white Democrats started terrorizing Republican opponents who appeared poised to secure victory at the polls. On September 28, a group of men attacked 18-year-old schoolteacher Emerson Bentley, who had already attracted ire for teaching African American students, after he published an account of local Democrats’ intimidation of Republicans. Bentley escaped with his life, but 27 of the 29 African Americans who arrived on the scene to help him were summarily executed. Over the next two weeks, vigilante terror led to the deaths of some 250 people, the majority of whom were black.
In April 1873, another spate of violence rocked Louisiana. The Colfax Massacre, described by historian Eric Foner as the “bloodiest single instance of racial carnage in the Reconstruction era,” unfolded under similar circumstances as Opelousas, with tensions between Democrats and Republicans culminating in the deaths of between 60 and 150 African Americans, as well as three white men.
Between the turn of the 20th century and the 1920s, multiple massacres broke out in response to false allegations that young black men had raped or otherwise assaulted white women. In August 1908, a mob terrorized African American neighborhoods across Springfield, Illinois, vandalizing black-owned businesses, setting fire to the homes of black residents, beating those unable to flee and lynching at least two people. Local authorities, argues historian Roberta Senechal, were “ineffectual at best, complicit at worst.”
False accusations also sparked a July 1919 race riot in Washington, D.C. and the Tulsa Race Massacre of 1921, which was most recently dramatized in the HBO series “Watchmen.” As African American History Museum curator Paul Gardullo tells Smithsonian, tensions related to Tulsa’s economy underpinned the violence: Forced to settle on what was thought to be worthless land, African Americans and Native Americans struck oil and proceeded to transform the Greenwood neighborhood of Tulsa into a prosperous community known as “Black Wall Street.” According to Gardullo, “It was the frustration of poor whites not knowing what to do with a successful black community, and in coalition with the city government [they] were given permission to do what they did.”
Over the course of two days in spring 1921, the Tulsa Race Massacre claimed the lives of an estimated 300 black Tulsans and displaced another 10,000. Mobs burned down at least 1,256 residences, churches, schools and businesses and destroyed almost 40 blocks of Greenwood. As the Sidedoor episode “Confronting the Past” notes, “No one knows how many people died, no one was ever convicted, and no one really talked about it nearly a century later.”
The second season of Sidedoor told the story of the Tulsa Race Massacre of 1921.
Economic injustice also led to the East St. Louis Race War of 1917. This labor dispute-turned-deadly found “people’s houses being set ablaze, … people being shot when they tried to flee, some trying to swim to the other side of the Mississippi while being shot at by white mobs with rifles, others being dragged out of street cars and beaten and hanged from street lamps,” recalled Dhati Kennedy, the son of a survivor who witnessed the devastation firsthand. Official counts place the death toll at 39 black and 9 white individuals, but locals argue that the real toll was closer to 100.
A watershed moment for the burgeoning civil rights movement was the 1955 murder of 14-year-old Emmett Till. Accused of whistling at a white woman while visiting family members in Mississippi, he was kidnapped, tortured and killed. Emmett’s mother, Mamie Till Mobley, decided to give her son an open-casket funeral, forcing the world to confront the image of his disfigured, decomposing body. (Visuals, including photographs, movies, television clips and artwork, played a key role in advancing the movement.) The two white men responsible for Till’s murder were acquitted by an all-white jury. A marker at the site where the teenager’s body was recovered has been vandalized at least three times since its placement in 2007.
The form of anti-black violence with the most striking parallels to contemporary conversations is police brutality. As Katie Nodjimbadem reported in 2017, a regional crime survey of late 1920s Chicago and Cook County, Illinois, found that while African Americans constituted just 5 percent of the area’s population, they made up 30 percent of the victims of police killings. Civil rights protests exacerbated tensions between African Americans and police, with events like the Orangeburg Massacre of 1968, in which law enforcement officers shot and killed three student activists at South Carolina State College, and the Glenville shootout, which left three police officers, three black nationalists and one civilian dead, fostering mistrust between the two groups.
Today, this legacy is exemplified by broken windows policing, a controversial approach that encourages racial profiling and targets African American and Latino communities. “What we see is a continuation of an unequal relationship that has been exacerbated, made worse if you will, by the militarization and the increase in fire power of police forces around the country,” William Pretzer, senior curator at NMAAHC, told Smithsonian in 2017.
The history of protest and revolt in the United States is inextricably linked with the racial violence detailed above.
Prior to the Civil War, enslaved individuals rarely revolted outright. Nat Turner, whose 1831 insurrection ended in his execution, was one of the rare exceptions. A fervent Christian, he drew inspiration from the Bible. His personal copy, now housed in the collections of the African American History Museum, represented the “possibility of something else for himself and for those around him,” curator Mary Elliott told Smithsonian’s Victoria Dawson in 2016.
Other enslaved African Americans practiced less risky forms of resistance, including working slowly, breaking tools and setting objects on fire. “Slave rebellions, though few and small in size in America, were invariably bloody,” wrote Dawson. “Indeed, death was all but certain.”
One of the few successful uprisings of the period was the Creole Rebellion. In the fall of 1841, 128 enslaved African Americans traveling aboard The Creole mutinied against its crew, forcing their former captors to sail the brig to the British West Indies, where slavery was abolished and they could gain immediate freedom.
An April 1712 revolt found enslaved New Yorkers setting fire to white-owned buildings and firing on slaveholders. Quickly outnumbered, the group fled but was tracked to a nearby swamp; though several members were spared, the majority were publicly executed, and in the years following the uprising, the city enacted laws limiting enslaved individuals’ already scant freedom. In 1811, meanwhile, more than 500 African Americans marched on New Orleans while chanting “Freedom or Death.” Though the German Coast uprising was brutally suppressed, historian Daniel Rasmussen argues that it “had been much larger—and come much closer to succeeding—than the planters and American officials let on.”
Some 150 years after what Rasmussen deems America’s “largest slave revolt,” the civil rights movement ushered in a different kind of protest. In 1955, police arrested Rosa Parks for refusing to yield her bus seat to a white passenger (“I had been pushed around all my life and felt at this moment that I couldn’t take it any more,” she later wrote). The ensuing Montgomery bus boycott, in which black passengers refused to ride public transit until officials met their demands, led the Supreme Court to rule segregated buses unconstitutional. Five years later, the Greensboro Four similarly took a stand, ironically by staging a sit-in at a Woolworth’s lunch counter. As Christopher Wilson wrote ahead of the 60th anniversary of the event, “What made Greensboro different [from other sit-ins] was how it grew from a courageous moment to a revolutionary movement.”
During the 1950s and ’60s, civil rights leaders adopted varying approaches to protest: Malcolm X, a staunch proponent of black nationalism who called for equality by “any means necessary,” “made tangible the anger and frustration of African Americans who were simply catching hell,” according to journalist Allison Keyes. He repeated the same argument “over and over again,” wrote academic and activist Cornel West in 2015: “What do you think you would do after 400 years of slavery and Jim Crow and lynching? Do you think you would respond nonviolently? What’s your history like? Let’s look at how you have responded when you were oppressed. George Washington—revolutionary guerrilla fighter!”
Martin Luther King Jr. famously advocated for nonviolent protest, albeit not in the form that many think. As biographer Taylor Branch told Smithsonian in 2015, King’s understanding of nonviolence was more complex than is commonly argued. Unlike Mahatma Gandhi’s “passive resistance,” King believed resistance “depended on being active, using demonstrations, direct actions, to ‘amplify the message’ of the protest they were making,” according to Ron Rosenbaum. In the activist’s own words, “[A] riot is the language of the unheard. And what is it America has failed to hear?… It has failed to hear that the promises of freedom and justice have not been met.”
Another key player in the civil rights movement, the militant Black Panther Party, celebrated black power and operated under a philosophy of “demands and aspirations.” The group’s Ten-Point Program called for an “immediate end to POLICE BRUTALITY and MURDER of Black people,” as well as more controversial measures like freeing all black prisoners and exempting black men from military service. Per NMAAHC, black power “emphasized black self-reliance and self-determination more than integration,” calling for the creation of separate African American political and cultural organizations. In doing so, the movement ensured that its proponents would attract the unwelcome attention of the FBI and other government agencies.
Many of the protests now viewed as emblematic of the fight for racial justice took place in the 1960s. On August 28, 1963, more than 250,000 people gathered in D.C. for the March on Washington for Jobs and Freedom. Ahead of the 50th anniversary of the march, activists who attended the event detailed the experience for a Smithsonian oral history: Entertainer Harry Belafonte observed, “We had to seize the opportunity and make our voices heard. Make those who are comfortable with our oppression—make them uncomfortable—Dr. King said that was the purpose of this mission,” while Representative John Lewis recalled, “Looking toward Union Station, we saw a sea of humanity; hundreds, thousands of people. … People literally pushed us, carried us all the way, until we reached the Washington Monument and then we walked on to the Lincoln Memorial.”
Two years after the March on Washington, King and other activists organized a march from Selma, Alabama, to the state capital of Montgomery. Later called the Selma March, the protest was dramatized in a 2014 film starring David Oyelowo as MLK. (Reflecting on Selma, Smithsonian Secretary Lonnie Bunch, then-director of NMAAHC, deemed it a “remarkable film” that “does not privilege the white perspective … [or] use the movement as a convenient backdrop for a conventional story.”)
Organized in response to the manifest obstacles black individuals faced when attempting to vote, the Selma March actually consisted of three separate protests. The first of these, held on March 7, 1965, ended in a tragedy now known as Bloody Sunday. As peaceful protesters gathered on the Edmund Pettus Bridge—named for a Confederate general and local Ku Klux Klan leader—law enforcement officers attacked them with tear gas and clubs. One week later, President Lyndon B. Johnson offered the Selma protesters his support and introduced legislation aimed at expanding voting rights. During the third and final march, organized in the aftermath of Johnson’s announcement, tens of thousands of protesters (protected by the National Guard and personally led by King) converged on Montgomery. Along the way, interior designer Carl Benkert used a hidden reel-to-reel tape recorder to document the sounds—and specifically songs—of the event.
The protests of the early and mid-1960s culminated in the widespread unrest of 1967 and 1968. For five days in July 1967, riots on a scale unseen since 1863 rocked the city of Detroit: As Lorraine Boissoneault writes, “Looters prowled the streets, arsonists set buildings on fire, civilian snipers took position from rooftops and police shot and arrested citizens indiscriminately.” Systemic injustice in such areas as housing, jobs and education contributed to the uprising, but police brutality was the driving factor behind the violence. By the end of the riots, 43 people were dead. Hundreds sustained injuries, and more than 7,000 were arrested.
The Detroit riots of 1967 prefaced the seismic changes of 1968. As Matthew Twombly wrote in 2018, tensions surrounding the Vietnam War, the Cold War, civil rights, human rights and youth culture “exploded with force in 1968,” triggering aftershocks that would resonate both in America and abroad for decades to come.
On February 1, black sanitation workers Echol Cole and Robert Walker died in a gruesome accident involving a malfunctioning garbage truck. Their deaths, compounded by Mayor Henry Loeb’s refusal to negotiate with labor representatives, led to the outbreak of the Memphis sanitation workers’ strike—an event remembered both “as an example of powerless African Americans standing up for themselves” and as the backdrop to King’s April 4 assassination.
Though King is lionized today, he was highly unpopular at the time of his death. According to a Harris Poll conducted in early 1968, nearly 75 percent of Americans disapproved of the civil rights leader, who had become increasingly vocal in his criticism of the Vietnam War and economic inequity. Despite the public’s seeming ambivalence toward King—and his family’s calls for nonviolence—his murder sparked violent protests across the country. In all, the Holy Week Uprisings spread to nearly 200 cities, leaving 3,500 people injured and 43 dead. Roughly 27,000 protesters were arrested, and 54 of the cities involved sustained more than $100,000 in property damage.
In May, thousands flocked to Washington, D.C. for a protest King had planned prior to his death. Called the Poor People’s Campaign, the event united racial groups from all quarters of America in a call for economic justice. Attendees constructed “Resurrection City,” a temporary settlement made up of 3,000 wooden tents, and camped out on the National Mall for 42 days.
“While we were all in a kind of depressed state about the assassinations of King and RFK, we were trying to keep our spirits up, and keep focused on King’s ideals of humanitarian issues, the elimination of poverty and freedom,” protester Lenneal Henderson told Smithsonian in 2018. “It was exciting to be part of something that potentially, at least, could make a difference in the lives of so many people who were in poverty around the country.”
Racial unrest persisted throughout the year, with uprisings on the Fourth of July, a protest at the Summer Olympic Games, and massacres at Orangeburg and Glenville testifying to the tumultuous state of the nation.
The Black Lives Matter marches organized in response to the killings of George Floyd, Philando Castile, Freddie Gray, Eric Garner, Sandra Bland, Trayvon Martin, Michael Brown and other victims of anti-black violence share many parallels with protests of the past.
Football player Colin Kaepernick’s decision to kneel during the national anthem—and the unmitigated outrage it sparked—bears similarities to the story of boxer Muhammad Ali, historian Jonathan Eig told Smithsonian in 2017: “It’s been eerie to watch it, that we’re still having these debates that black athletes should be expected to shut their mouths and perform for us,” he said. “That’s what people told Ali 50 years ago.”
Other aspects of modern protest draw directly on uprisings of earlier eras. In 2016, for instance, artist Dread Scott updated an anti-lynching poster used by the National Association for the Advancement of Colored People (NAACP) in the 1920s and ’30s to read “A Black Man Was Lynched by Police Yesterday.” (Scott added the words “by police.”)
Though the civil rights movement is often viewed as the result of a cohesive “grand plan” or “manifestation of the vision of the few leaders whose names we know,” the American History Museum’s Christopher Wilson argues that “the truth is there wasn’t one, there were many and they were often competitive.”
Meaningful change required a whirlwind of revolution, adds Wilson, “but also the slow legal march. It took boycotts, petitions, news coverage, civil disobedience, marches, lawsuits, shrewd political maneuvering, fundraising, and even the violent terror campaign of the movement’s opponents—all going on [at] the same time.”
In layman’s terms, intersectionality refers to the multifaceted discrimination experienced by individuals who belong to multiple minority groups. As theorist Kimberlé Crenshaw explains in a video published by NMAAHC, these classifications run the gamut from race to gender, gender identity, class, sexuality and disability. A black woman who identifies as a lesbian, for instance, may face prejudice based on her race, gender or sexuality.
Crenshaw, who coined the term intersectionality in 1989, explains the concept best: “Consider an intersection made up of many roads,” she says in the video. “The roads are the structures of race, gender, gender identity, class, sexuality, disability. And the traffic running through those roads are the practices and policies that discriminate against people. Now if an accident happens, it can be caused by cars traveling in any number of directions, and sometimes, from all of them. So if a black woman is harmed because she is in an intersection, her injury could result from discrimination from any or all directions.”
Understanding intersectionality is essential for teasing out the relationships between movements including civil rights, LGBTQ rights, suffrage and feminism. Consider the contributions of black transgender activists Marsha P. Johnson and Sylvia Rivera, who played pivotal roles in the Stonewall Uprising; gay civil rights leader Bayard Rustin, who was only posthumously pardoned this year for having consensual sex with men; the “rank and file” women of the Black Panther Party; and African American suffragists such as Mary Church Terrell and Nannie Helen Burroughs.
All of these individuals fought discrimination on multiple levels: As noted in “Votes for Women: A Portrait of Persistence,” a 2019 exhibition at the National Portrait Gallery, leading suffrage organizations initially excluded black suffragists from their ranks, driving the emergence of separate suffrage movements and, eventually, a black feminism grounded in the inseparable experiences of racism, sexism and classism.
Individuals striving to become better allies by educating themselves and taking decisive action have an array of options for getting started. Begin with NMAAHC’s “Talking About Race” portal, which features sections on being antiracist, whiteness, bias, social identities and systems of oppression, self-care, race and racial identity, the historical foundations of race, and community building. An additional 139 items—from a lecture on the history of racism in America to a handout on white supremacy culture and an article on the school-to-prison pipeline—are available to explore via the portal’s resources page.
In collaboration with the International Coalition of Sites of Conscience, the National Museum of the American Indian has created a toolkit that aims to “help people facilitate new conversations with and among students about the power of images and words, the challenges of memory, and the relationship between personal and national value,” says museum director Kevin Gover in a statement. The Smithsonian Asian Pacific American Center offers a similarly focused resource called “Standing Together Against Xenophobia.” As the site’s description notes, “This includes addressing not only the hatred and violence that has recently targeted people of Asian descent, but also the xenophobia that plagues our society during times of national crisis.”
Ahead of NMAAHC’s official opening in 2016, the museum hosted a series of public programs titled “History, Rebellion, and Reconciliation.” Panels included “Ferguson: What Does This Moment Mean for America?” and “#Words Matter: Making Revolution Irresistible.” As Smithsonian reported at the time, “It was somewhat of a refrain at the symposium that museums can provide ‘safe,’ or even ‘sacred’ spaces, within which visitors [can] wrestle with difficult and complex topics.” Then-director Lonnie Bunch expanded on this mindset in an interview, telling Smithsonian, “Our job is to be an educational institution that uses history and culture not only to look back, not only to help us understand today, but to point us towards what we can become.” For more context on the museum’s collections, mission and place in American history, visit Smithsonian’s “Breaking Ground” hub and NMAAHC’s digital resources guide.
Historical examples of allyship offer both inspiration and cautionary tales for the present. Take, for example, Albert Einstein, who famously criticized segregation as a “disease of white people” and continually used his platform to denounce racism. (The scientist’s advocacy is admittedly complicated by travel diaries that reveal his deeply troubling views on race.)
Einstein’s near-contemporary, a white novelist named John Howard Griffin, took his supposed allyship one step further, darkening his skin and embarking on a “human odyssey through the South,” as Bruce Watson wrote in 2011. Griffin’s chronicle of his experience, a volume titled Black Like Me, became a surprise bestseller, refuting “the idea that minorities were acting out of paranoia,” according to scholar Gerald Early, and testifying to the veracity of black people’s accounts of racism.
“The only way I could see to bridge the gap between us,” wrote Griffin in Black Like Me, “was to become a Negro.”
Griffin, however, had the privilege of being able to shed his blackness at will—which he did after just one month of wearing the darkening makeup. By that point, Watson observed, Griffin could simply “stand no more.”
Sixty years later, what is perhaps most striking is just how little has changed. As Bunch reflected earlier this week, “The state of our democracy feels fragile and precarious.”
Addressing the racism and social inequity embedded in American society will be a “monumental task,” the secretary added. But “the past is replete with examples of ordinary people working together to overcome seemingly insurmountable challenges. History is a guide to a better future and demonstrates that we can become a better society—but only if we collectively demand it from each other and from the institutions responsible for administering justice.”
Editor’s Note, July 24, 2020: This article previously stated that some 3.9 million of the 10.7 million people who survived the harrowing two-month journey across the Middle Passage between 1525 and 1866 were ultimately enslaved in the United States. In fact, the 3.9 million figure refers to the number of enslaved individuals in the U.S. just before the Civil War. We regret the error.
|
e94dd41d7dbb1bed86a2813614434aeb | https://www.smithsonianmag.com/history/17th-century-english-who-settled-southern-us-had-very-little-be-thankful-180953466/ | The 17th-Century English Who Settled in the Southern U.S. Had Very Little to be Thankful For | The 17th-Century English Who Settled in the Southern U.S. Had Very Little to be Thankful For
Do you have complicated feelings about Thanksgiving? Maybe your ancestors were among this continent’s indigenous peoples, and you have good reason to be rankled by thoughts of newly arrived English colonists feasting on Wampanoag-procured venison, roasted wild turkey, and stores of indigenous corn. Or maybe Thanksgiving marks the beginning of a holiday season that brings with it the intricate emotional challenges of memory, home and family.
If you’re someone who feels a sense of angst, foreboding, or misery about this time of year, take heart: American history is on your side.
The truth of our history is that only a small minority of the early English immigrants to this country would have been celebrating as the New England Puritans did at the first Thanksgiving feast in 1621.
A thousand miles south, in Virginia and the Carolinas, the mood and the menu would have been drastically different—had there ever been a Thanksgiving there. Richard Frethorne, an indentured servant in the Virginia colony during the 1620s, wrote in a letter: “Since I came out of the ship, I never ate anything but peas, and loblollie (that is, water gruel).”
And don’t imagine for a second that those peas Frethorne was gobbling down were of the lovely, tender green garden variety dotted with butter. No, in the 1620s, Frethorne and his friends would have subsisted on a grey field pea resembling a lentil.
“As for deer or venison,” Frethorne wrote, “I never saw any since I came into this land. There is indeed some fowl, but we are not allowed to go and get it, but must work hard both early and late for a mess of water gruel and a mouthful of bread and beef.”
Frethorne’s letter is a rare surviving document reflecting the circumstances of the majority of English colonists who came to North America in the 17th century. The New England Puritans, after all, comprised only 15 to 20 percent of early English colonial migration.
Not only did the majority of English colonial migrants eat worse than the Puritans, but also their prayers (had they said any) would have sounded decidedly less thankful.
“People cry out day and night,” Frethorne wrote, “Oh! That they were in England without their limbs—and would not care to lose any limb to be in England again, yea though they beg from door to door.”
English migrants in Virginia had good reason not to feel grateful. Most came unfree, pushed out of England by economic forces that privatized shared pastures and farmlands and pushed up the prices of basic necessities. By the 17th century, more than half of the English peasantry was landless. The price of food shot up 600 percent, and firewood by 1,500 percent.
Many peasants who were pushed off their homelands built makeshift settlements in the forests, earning reputations as criminals and thieves. Others moved to the cities, and when the cities proved no kinder, they signed contracts promising seven years of hard labor in exchange for the price of passage to the Americas, and were boarded onto boats.
A trip to Virginia cost Frethorne and others like him six months’ salary and took about 10 weeks. One quarter to one half of new arrivals to Virginia and the Carolinas died within one year due to diseases like dysentery, typhoid, and malaria. Others succumbed to the strain of hard labor in a new climate and a strange place—an adjustment process the English described as “seasoning.” Only 7 percent of indentured servants claimed the land that they had been promised.
Most of these common English migrants did not read or write, so vivid and revealing letters like Frethorne’s are rare. But in the research for my book Why We Left: Untold Stories and Songs of America’s First Immigrants, I learned how English migrants viewed their situation through the songs they sang about the voyage across the Atlantic Ocean. Those songs survived hundreds of years by word of mouth before they were written down in the 20th century.
These were not songs of thankfulness—not by a long shot. They were ballads full of ghastly scenes of the rejection, betrayal, cruelty, murder, and environmental ruin that had driven them out of England, and of the seductive but false promises that drew them to America. These 17th-century songs planted the seeds for a new American genre of murder and hard-luck ballads that was later picked up and advanced by singers like Johnny Cash, whose ancestors, like mine, were among those early hard-luck migrants from England to America.
So if you find yourself a little blue this holiday season, take your marshmallow-topped sweet potatoes with a liberal dose of the Man In Black, and reassure yourself that you are a part of a long, long American tradition.
Joanna Brooks is Associate Dean of Graduate and Research Affairs at San Diego State University and author of Why We Left: Untold Stories and Songs of America’s First Immigrants (Minnesota, 2013). She wrote this for Zocalo Public Square.
|
91e2d2546586fd99202f3739e76e41cc | https://www.smithsonianmag.com/history/1911-report-set-america-on-path-screening-out-undesirable-immigrants-180969636/ | A 1911 Report Set America On a Path of Screening Out ‘Undesirable’ Immigrants | A 1911 Report Set America On a Path of Screening Out ‘Undesirable’ Immigrants
The Dillingham Commission is today little known. But a century ago, it stood at the center of a transformation in immigration policy, exemplifying Americans’ simultaneous feelings of fascination and fear toward the millions of migrants who have made the United States their home.
In 1911, the Dillingham Commission produced perhaps the most extensive investigation of immigration in the history of the country, an exhaustive 41-volume study that demonstrated just how vital 19th-century and early-20th-century immigrants were to the U.S. economy. But the commission’s own recommendations, delivered in the context of a fierce backlash against migrants, set the foundation for the end of industrial-era immigration and a half-century of exclusionist policies.
Congress created the Commission in 1907 in an effort to find a compromise between proponents and opponents of immigration. During the previous several decades, pundits and lawmakers had debated the need to impose restrictions on immigration. Lawmakers enacted several policies intended to interdict those deemed to pose a specific danger, such as people afflicted with contagious diseases or guilty of crimes of moral turpitude. One notable act excluded Chinese laborers, and another prohibited the entry of workers who had been hired overseas by U.S. companies.
But critics dismissed these provisions as insufficient, and instead sought laws to reduce the overall number of entrants and improve their quality, the latter of which meant attributes, like literacy, that were perceived to make it easier for newcomers to assimilate and contribute to the nation.
The literacy test, a requirement that most adult immigrants be able to read or write, became the preferred restriction. Supporters saw it as the best means of securing the “most desirable” migrants, while critics saw education as the product of opportunity, not character or potential. In 1907, when Congress could not agree on its propriety, it created the Dillingham Commission—named for its chairman, U.S. Sen. William P. Dillingham, a Vermont Republican.
Over the next three years, the nine-member commission—comprising three U.S. senators, three representatives, and three “experts” selected by President Theodore Roosevelt—fulfilled its charge by conducting a thorough and wide-ranging investigation of current and past immigration. Its multi-volume Reports is a treasure trove of information that remains profoundly useful to students of immigration today.
Most of the work centered on “Immigrants in Industry,” but other topics of inquiry included “Immigrants in Cities,” “Children of Immigrants in Schools,” and a study of changes in immigrant physiology, the last conducted by anthropologist Franz Boas. He and his associates took head, or cranium, measurements of immigrant children in schools and concluded that the U.S. environment was engendering positive changes in “bodily form.” The children had features less like their European counterparts and more like “American types.” Commissioner Jeremiah Jenks also prepared a controversial Dictionary of Races, in which he sought definitively to identify and characterize the world’s races—or ethnic groups.
But it was not these reports that made the most impact. The commission also produced a compendium to summarize its findings and make policy recommendations. The latter would have profound effects.
The commissioners based their recommendations on the principle of admitting immigrants of such “quantity and quality as not to make too difficult the process of assimilation.” This, they acknowledged, constituted a departure from America’s traditional welcome of “the oppressed of other lands.” A corollary called for basing admission standards on “the prosperity and economic well-being of our people.” This raised the question of which policies would produce the desired effect. The recommended literacy test, argued racial theorist Madison Grant, would exclude low-quality individuals lacking in social, physical, and mental capabilities who added nothing of value to America’s moral or intellectual character. Others saw it excluding too many of the hard-working manual laborers who had forged America’s steel and built its railroads.
After intense debate, the commission recommended passage of the literacy test, calling it “the single most feasible” method of exclusion. Restrictionists viewed this as an endorsement of their cause and used the recommendation to secure the test’s eventual passage by Congress in 1917.
The Reports also mentioned several other possible means of restriction that could warrant future consideration. These included the “limitation of the number of each race arriving each year to a certain percentage of that race arriving during a given period of years.” At the time, “race” was often equated with the modern meaning of ethnicity and sometimes drew its terminology from nationality, such as references to the “German race.” Jews, however, were considered a distinctive race, subsumed within various nation-states.
William W. Husband, the Commission’s chief administrator, thereafter developed a quota scheme based on the 1910 census. Admission of immigrants belonging to a particular nationality would be limited to 5 percent of their total as reported in the census. Congress reduced that percentage to 3 in its temporary quota measure, passed in 1921. The permanent measure, passed in 1924, lowered it to 2 percent and used the 1890 census as the benchmark. The changes were deliberately designed to exclude more southern and eastern Europeans, so-called new immigrants deemed “undesirable” by many contemporary Americans. Asians, deemed wholly “undesirable,” did not receive any quotas. (Intriguingly, the Quota Acts exempted immigrants from the Western Hemisphere.) These provisions would define American immigration policy until passage of the Immigration and Naturalization Act of 1965.
The experience and impact of the Dillingham Commission offers lasting lessons to a country that still argues about immigration. The chief one is that fear tends to override facts about immigration policy, even when facts are in abundance.
Throughout its inquiry, the commission’s investigators sought to maintain objectivity: to collect the facts, let them speak for themselves, and make recommendations absent bias. Throughout the Reports, the commission described immigrants positively, including the vilified “new” arrivals. Even the verbiage immediately preceding the recommendation of the literacy test spoke of them positively.
Yet, a social climate of fear and bigotry hijacked the investigation, and the commissioners themselves, ignoring facts in their own reports, endorsed restriction, largely to exclude the most recent types of immigrants. Critics, to no avail, would argue that socioeconomic conditions did not warrant more extensive exclusion, based on the commission’s own standards for such action. But the commission’s identification of the literacy test as the most “feasible method” trumped any such assertions.
So, too, when William Husband drafted his initial quota proposal, he based it on much more generous terms than did the congressmen who approved the final version. He also included quotas that would admit people from Asian countries—but the final versions in the quota laws had none, as bigoted extremism carried the day. The United States would enforce an Asian Exclusion Zone until the 1950s, and then establish only minuscule Asian quotas.
|
de6bcd23d67fec8ba18515d3ae51423f | https://www.smithsonianmag.com/history/1919-murder-case-gave-americans-right-remain-silent-180968916/ | The Triple Homicide in D.C. That Laid the Groundwork for Americans’ Right to Remain Silent | The Triple Homicide in D.C. That Laid the Groundwork for Americans’ Right to Remain Silent
If you’ve ever watched an American television crime drama, you probably can recite a suspect’s rights along with the arresting officers. Those requirements—that prisoners must be informed that they may remain silent, and that they have the right to an attorney—are associated in the public mind with Ernesto Miranda, convicted in Arizona of kidnapping and rape in 1963.
But the “Miranda rights” routinely read to suspects as a result of the 1966 Supreme Court decision that overturned his conviction have their roots in a much earlier case: that of a young Chinese man accused of murdering three of his countrymen in Washington, D.C., in 1919.
The nation’s capital had never seen anything quite like it: a triple murder of foreign diplomats. The victims worked for the Chinese Educational Mission and were assassinated in the city’s tony Kalorama neighborhood. With no obvious motive or leads to go on, the Washington police were baffled. But once they zeroed in on a suspect, they marched into his Manhattan apartment, searched it without a warrant, and pressured him to return to Washington with them. There they held him incommunicado in a hotel room without formal arrest to browbeat him into a confession.
The young Chinese man, Ziang Sung Wan, a sometime student who had been seen at the death house on the day of the murders, was suffering from the aftereffects of the Spanish flu, and the police took advantage of his distress. He was questioned day and night, even when he was in severe pain and did not wish to speak. After nine days, he was brought back to the scene of the murder and subjected to harsh interrogation. Food and water were denied, as were bathroom breaks. Racial epithets were hurled. Finally, under extreme duress, he confessed and was immediately arrested.
At trial, Wan recanted his confession, which he claimed he had made only to stop the relentless grilling by the detectives. But the judge refused to exclude it, and he was convicted of first-degree murder, which carried the penalty of death by hanging. His attorneys made their objection to the confession the centerpiece of their appeal to a higher court. But the appellate court, citing an 1897 U.S. Supreme Court precedent, sustained the verdict, ruling that only promises or threats from the police would have given cause to exclude it.
When President Warren G. Harding refused to commute Wan’s sentence, his only hope lay with the Supreme Court, to which his attorneys immediately appealed. Under the leadership of Chief Justice William Howard Taft, the Court had been passive on civil liberties, if not hostile to them. So it was a surprise to many that it chose to consider the case.
As it happened, there was good reason to accept it. In the quarter-century since the 1897 ruling, the country had been embroiled in a robust national debate about the ethics and efficacy of what had come to be called the “third degree.” Creative detectives had come up with many methods of extracting confessions from unwilling suspects, some of which amounted to nothing short of torture. As techniques like quartering suspects in pitch-dark cells, turning up the heat to “sweat” confessions out of them, and even blowing red pepper or releasing red ants into their cells were exposed, the public reaction was strongly negative. The newspapers began decrying the practices as brutal and un-American.
At the same time, there was a fierce debate going on in the judiciary as to what kinds of interrogations and police conduct actually were prohibited under the law. All of this, on top of the staggering evidence that Wan’s confession had been coerced, provided ample justification for the Supreme Court to bring order to the chaos surrounding confessions.
After oral arguments were heard, the task of drafting the opinion fell to Justice Louis D. Brandeis. The Harvard-educated jurist—an unapologetic progressive and civil libertarian and a tireless fighter for social justice, freedom of speech, and the right to privacy—was the ideal choice. All the justices eventually united behind his ruling, the power and seminal nature of which can be found in its elegance and brevity. In throwing out Wan’s confession, the Court affirmed that the Fifth Amendment permitted only voluntary confessions to be admitted as evidence in federal proceedings and that voluntariness didn’t rest solely on whether a promise or threat had been made.
Wan was retried—twice, in fact—without his confession being admitted into evidence. But after two hung juries, both with majorities favoring acquittal, the Justice Department gave up prosecuting him. His case, however, lived on as a cause célèbre.
Two important challenges lay ahead before all of America’s accused could enjoy full protection under this new principle of law. First, because Wan had been tried in the District of Columbia, where the federal government was in charge of local affairs, the new standard applied only to cases before federal courts. The privileges promised to the accused in the Bill of Rights had not yet been determined to apply to the states and localities; that convoluted process of extension, known as the “incorporation doctrine,” would take decades. And second, the new standard lacked clarity. For all his eloquence, Brandeis hadn’t provided a satisfactory definition of what made a confession voluntary, or instructions as to what had to be done to ensure a confession was lawful.
As a result, the concept remained open to interpretation for decades, and as the Supreme Court heard case after case in which law enforcement ran roughshod over individual rights, and defendants—especially minorities—were mistreated between arrest and trial, it became palpably clear that in order to ensure voluntariness, police behavior would again have to be addressed explicitly. But this time the remedy would not involve outlawing nefarious police practices that might negate it so much as mandating constructive behavior that would ensure it.
In writing the opinion in the 1966 case of Miranda v. Arizona, Chief Justice Earl Warren quoted liberally from Ziang Sung Wan v. United States. And he mandated safeguards that were ultimately condensed into the summary statement familiar to most Americans today as Miranda rights. They serve to inform suspects in clear and unequivocal terms that they have a right to remain silent, that anything they say may be used against them in a court of law, that they have the right to counsel and that if they are unable to afford one, an attorney will be appointed for them.
**********
Scott D. Seligman is a writer, a historian, a genealogist, a retired corporate executive, and a career “China hand.” He is the author of The Third Degree: The Triple Murder that Shook Washington and Changed American Criminal Justice and several other nonfiction books.
This essay is part of What It Means to Be American, a project of the Smithsonian’s National Museum of American History and Arizona State University, produced by Zócalo Public Square.
|
ad1eca57cc91db6e4f686d0c4f47cb60 | https://www.smithsonianmag.com/history/1924-law-slammed-door-immigrants-and-politicians-who-pushed-it-back-open-180974910/ | The 1924 Law That Slammed the Door on Immigrants and the Politicians Who Pushed it Back Open | The 1924 Law That Slammed the Door on Immigrants and the Politicians Who Pushed it Back Open
“AMERICA OF THE MELTING POT COMES TO END,” the New York Times headline blared in late April 1924. The opinion piece that followed, penned by Senator David Reed of Pennsylvania, claimed recent immigrants from southern and Eastern European countries had failed to satisfactorily assimilate and championed his recently passed legislation to severely restrict immigration to the United States. He proudly proclaimed, “The racial composition of America at the present time thus is made permanent.”
The 1924 Johnson-Reed Act, which Congress had overwhelmingly passed just weeks before and which President Coolidge would sign into law the following month, marked the start of a dark chapter in the nation’s immigration history. It drastically cut the total number of immigrants allowed in each year and effectively cut off all immigration from Asia. It made permanent strict quotas—defined as “two percent of the total number of people of each nationality in the United States as of the 1890 national census”—in order to favor immigrants from northern and Western Europe and preserve the homogeneity of the nation. The new system also required immigrants to apply for and receive visas before arriving and established the U.S. Border Patrol.
The restrictions imposed by the law sparked a prolonged fight to reverse them, driven by politicians who decried the law’s xenophobia and by presidents who worried about the foreign policy consequences of such exclusions. In her new book, One Mighty and Irresistible Tide: The Epic Struggle Over American Immigration, 1924-1965, journalist Jia Lynn Yang, a deputy national editor at The New York Times, details the drive to implement and sustain the 1924 legislation and the intense campaign to reverse it, a battle that culminated in the Immigration and Naturalization Act of 1965. That law eliminated the quotas, increased the number of visas issued each year, prioritized immigration for skilled workers and instituted a policy of family unification.
Yang spoke with Smithsonian about the advocates who led the way, the forces they battled and the legacy of their fight.
The idea of the United States as a nation of immigrants is at the core of the American narrative. But in 1924, Congress instituted a system of ethnic quotas so stringent that it choked off large-scale immigration for decades, sharply curtailing arrivals from southern and eastern Europe and outright banning those from nearly all of Asia.
The 1924 Johnson-Reed Act marked a schism in the country’s immigration history. How did the nation get to that point?
Before the act, there were these smaller attempts to restrict immigration. The most important was the 1882 Chinese Exclusion Act, which was quite a bold law that singled out, for the first time, an ethnic group for restriction.
Starting in the 1880s you have this historic wave of immigrants coming from southern and Eastern Europe. Jews, Italians. Lawmakers are continually trying to kind of stem that wave, and really it’s not until 1924 that they truly succeed. Because everything else they've tried [such as literacy tests] either gets vetoed by a president or doesn't really work.
1924 is really a watershed moment. Once you add a whole visa process, once you add these strict quotas, you’re just in a whole different regime of immigration. The system really just changes forever, and it’s a moment when the country I think symbolically says, ‘We’re not going to do things like this anymore. You can’t just show up.’
How did the theory of eugenics play a role in the new immigration system?
It became very important, because people with a lot of social influence really embraced it. These are leading economists, leading scientists, people who are really kind of dictating intellectual American life at the time. And [eugenics was] completely mainstream and considered very cutting edge, and just very current. If people could figure out a way to make a better society through this science, people didn't question why that was necessary or why their methods would work. And these experts began to testify before Congress as they're looking at immigration.
One of the primary examples would be [prominent eugenicist] Harry Laughlin. He hasn't spent his whole life being trained as a scientist, but he gets very excited about eugenics, joins people who are really hardcore scientists, and gets involved in the political side. Lawmakers treat him as kind of an in-house expert, essentially. He’s writing up reports at their behest, and pointing out, if you do the laws this way, you will actually improve the American bloodstream, and that's why you should do this. [Eugenicists] are people who were already very nativist and wanted to restrict immigration. But once they get the sort of scientific backing, it really strengthens their arguments, and that's how they're able to push this dramatic bill through in the ’20s.
The 1924 act was met with resistance during its passage and efforts to overturn it started immediately. What were the law’s opponents up against?
I think this notion—it's still very powerful now—that America should have some kind of ethnic makeup is actually a very hard thing to argue against. Their defense is one that I think you still see today, which is, “We're not being racist. We just want to keep a level of ethnic homogeneity in our society…we can't introduce new elements too quickly, and this is how we protect the stability of our country.”
I would also add that if you look at the polling on immigration over time—Gallup, for instance, has looked at this question for many, many years now—you hardly ever see Americans clamoring for more immigrants.
In fact, the people who want to change [immigration policy] are often presidents who are dealing with the foreign policy [consequences of the 1924 law]. That’s one thing that really surprised me in my research, is how immigration was driven by foreign policy concerns. So there are presidents who don't want to insult other leaders by saying, “We don't want people from your country.”
But your mainstream American is really not thinking about loosening immigration laws as a giant priority. Even now, you can see that both Democrats and Republicans are pretty leery of making that kind of super pro-loosening immigration laws argument. I don't think it's ever that politically popular to do that.
What finally led to the overhaul of the nation’s immigration laws in the 1960s?
It’s kind of an amazing confluence of events. Right before President Kennedy died, he introduced a bill to abolish these ethnic origins quotas. The bill doesn't really go anywhere, just as every other effort hadn't gone anywhere in 40 years. As usual, there's just not a lot of interest in changing the immigration quotas.
But when he is killed, President Johnson looks at the unfinished business of Kennedy and [thinks], ‘Let's honor the memory of our late president. Let's really do right by his memory. Let's make this stuff work. We've got to pass it.’
LBJ is leading the country in mourning, yes, but he also spots an extraordinary political opportunity to pass legislation, I think, that would otherwise never pass. The Civil Rights Act, Voting Rights Act, these are all kind of in that moment. But the immigration bill, too, has that kind of moral momentum from Kennedy’s death. You've got people talking about racial equality. We're going to be getting rid of Jim Crow laws, so we should also look at our immigration laws in the same way. They have a similar kind of racial and discriminatory problem to them.
At the same time you’ve got the Cold War argument—that these laws are embarrassing to us. They're not helping us win an ideological war against the Soviet Union. The other thing too is labor unions were anti-immigrant before. This is a moment where they actually flip sides. Once labor unions switch to the other side, that removes one of the big political opponents to changing the quotas.
Kennedy supported immigration reform and Johnson signed the 1965 act into law, but this wasn’t a consuming passion for either president. Who fought the legislation into being?
Emanuel “Manny” Celler was chair of the House Judiciary Committee for many, many years. Right when he becomes a Congressman, in 1923, he sees the quotas passed and is horrified, because he himself is from a German Jewish family and he represents a district in Brooklyn that is basically all immigrants from Europe. He basically spends the next 40 years trying to get rid of [the quotas]. He sees during World War II how [the quotas] make it impossible to admit Jewish refugees. After the war, he's still fighting and fighting and fighting, constantly losing. He’s sort of the rare person who is there to see the victory, but not everybody does.
I’m thinking of Herbert Lehman. He is from the famous Lehman Brothers family, and comes from a huge amount of money from New York. He was the first Jewish governor of New York, and he was kind of a right-hand man to FDR. He spends much of his Senate career in the '50s fighting [for immigration reform] and loses again and again, just like Celler and others, because of the Red Scare and a lot of anti-communist sentiment, which translates into anti-immigrant sentiment on the Hill.
Celebrating “America as a nation of immigrants” is a surprisingly recent idea. How did that idea develop and play into the 1965 legislation?
The story of Kennedy’s Nation of Immigrants [a book published posthumously in 1964] is sort of instructive with this. He is leaning on, and borrowing from, the work of immigration historian Oscar Handlin, who wrote this book called The Uprooted, which won a Pulitzer Prize in the early 1950s and was, at one point, assigned to a lot of schoolchildren to read. It was basically the seminal text that, for the first time that anyone could point to, celebrated all these immigrants who had come to this country and sort of pointed out the successive waves of people.
We often think of nationalism and immigration as opposing ideas and forces. The really interesting political turn in the '50s is to bring immigrants into this idea of American nationalism. It’s not that immigrants make America less special. It's that immigrants are what make America special.
Whereas in the '20s the argument was, “Keep America ‘American’ by keeping out immigrants.” Now it was, “If you're not going to welcome immigrants, you're not going to celebrate all these different waves of immigration, the Jews, the Italians, the Germans, you're just being un-American. You don't love this part of the American story.”
That is still a very powerful idea on the Left, in the Democratic Party. But I was really surprised in the research just how recent that is. That was a work of history. A historian had to put his finger on it. Then it had to then be translated into the political sphere to take on its own momentum, to become its own argument for immigrants.
What did advocates for the 1965 act expect when the law was signed? What has it looked like in reality?
The system they come up with is still really interesting to think about because it's very much the one we have today. They get rid of the quotas, and they prioritize family reunification. The people who get top priority for visas are people who already have family in the U.S. This is what the Trump administration wants to end. Just to give you a sense of just how little [the lawmakers] predicted what would happen: [reunification] was actually a compromise to nativists who wanted to keep America white.
Yet because of family reunification, once you do get enough people here who are outside Europe, their numbers actually grew and grew and grew and grew. A bunch of presidents kept adding these special carve-outs for different refugee populations, like the Cubans and Vietnamese.
Over time, the entire stream of immigrants just becomes much, much less European, much less white. To the point that now, I think we take for granted that a lot of our immigrants are from the Middle East, Africa, Asia, Latin America.
That is not something that I think almost anyone who was involved in the debate would have expected. In fact, they kept downplaying how much the law would change the actual demographics of the U.S. What's interesting to me is that no one quite knew what standing for the principle [of racial equality] would lead to in terms of what this country looked like.
How is what passed in 1965 tied to today’s immigration crisis?
At the end of this whole journey in 1965, [advocates] have to make a bunch of compromises and they added a numerical cap for the very first time on immigration from the Western hemisphere. So until that point—incredible to imagine right now because we are so fixated on securing the border—there was no numerical cap to how many people could come from Latin America and Canada. It was just totally open. That was, again, a foreign policy decision. It was an idea that you had to be friendly to your neighbors.
[The cap introduces] the idea of “illegal” immigrants from Mexico on this mass scale that didn't exist before. That just changed the nature of how we thought about Mexican immigrants forever, and which we are still living in the shadow of.
The law is lauded as a civil rights achievement by some, in that it basically bans racial discrimination in immigration laws and gets rid of these old ethnic quotas. But it really transforms our whole notion of our neighbors and our relationship to them as sources of immigration.
What were you most surprised to discover while researching and writing your book?
I got into this whole project for very personal reasons. I wanted to understand why my family had been allowed to come to this country [from Taiwan and China]. In retrospect, I feel kind of naïve for not having thought about it before. I so bought into this idea of America as a nation of immigrants that I hadn't even really seriously considered a possibility that my parents would have been rejected.
What was surprising to me was just to learn how easily that could have happened—and not just for me and my family but every family I know in America, basically, that's not from Europe. I now wonder, who among us would just not be here if not for the 1965 Immigration Nationality Act? And I think [it was surprising] understanding how hard that fight was to get it, how many times it didn't work, how many times it failed, how when it finally worked it was only because of this perfect convergence of all these different circumstances, literally from a president's assassination to somebody negotiating at the end, ‘We'll reunify families because that'll keep America more white,’ and then getting it wrong.
What is it like to release your book as the COVID-19 outbreak has led to a spike in anti-Asian sentiment and a resurgence of xenophobia?
When I started this book it was early 2016, before President Trump was elected. I never imagined how timely it would be. It really started as an exploration of, in a way, family history through American political history.
Knowing that history, knowing how recent [Asian Americans'] arrival is as a large racial group in this country, helps me to process what's happening now. Because I think part of what the xenophobia is revealing is just how tenuous, in a way, the Asian American political category can be. It's a group that often lacks a lot of political power and political voice.
I think of ourselves as very much in the tradition of other immigrants who've sort of come before, each of whom has also kind of had to establish their place in America.
For people like me, who are children of immigrants, who were able to come here because of the 1965 law, it's a chance to say, ‘Okay, this is our political history as a people. This is how we got here.’
Anna Diamond is the former assistant editor for Smithsonian magazine.
|
3c9b23b4446085e8cad0f6a40a1dc5b7 | https://www.smithsonianmag.com/history/1945-japanese-balloon-bomb-killed-six-americansfive-them-children-oregon-180972259/ | In 1945, a Japanese Balloon Bomb Killed Six Americans, Five of Them Children, in Oregon | In 1945, a Japanese Balloon Bomb Killed Six Americans, Five of Them Children, in Oregon
Elsye Mitchell almost didn’t go on the picnic that sunny day in Bly, Oregon. She had baked a chocolate cake the night before in anticipation of their outing, her sister would later recall, but the 26-year-old was pregnant with her first child and had been feeling unwell. On the morning of May 5, 1945, she decided she felt decent enough to join her husband, Rev. Archie Mitchell, and a group of Sunday school children from their tight-knit community as they set out for nearby Gearhart Mountain in southern Oregon. Against a scenic backdrop far removed from the war raging across the Pacific, Mitchell and five other children would become the first—and only—civilians to die by enemy weapons on the United States mainland during World War II.
While Archie parked their car, Elsye and the children stumbled upon a strange-looking object in the forest and shouted back to him. The reverend would later describe that tragic moment to local newspapers: “I…hurriedly called a warning to them, but it was too late. Just then there was a big explosion. I ran up – and they were all lying there dead.” Lost in an instant were his wife and unborn child, alongside Eddie Engen, 13, Jay Gifford, 13, Sherman Shoemaker, 11, Dick Patzke, 14, and Joan “Sis” Patzke, 13.
Dottie McGinnis, sister of Dick and Joan Patzke, later recalled to her daughter in a family memory book the shock of coming home to cars gathered in the driveway, and the devastating news that two of her siblings and friends from the community were gone. “I ran to one of the cars and asked is Dick dead? Or Joan dead? Is Jay dead? Is Eddie dead? Is Sherman dead? Archie and Elsye had taken them on a Sunday school picnic up on Gearhart Mountain. After each question they answered yes. At the end they all were dead except Archie.” Like most in the community, the Patzke family had no inkling that the dangers of war would reach their own backyard in rural Oregon.
But the eyewitness accounts of Archie Mitchell and others would not be widely known for weeks. In the aftermath of the explosion, the small, lumber milling community would bear the added burden of enforced silence. For Rev. Mitchell and the families of the children lost, the unique circumstances of their devastating loss would be shared by none and known by few.
In the months leading up to that spring day on Gearhart Mountain, there had been some warning signs, apparitions scattered around the western United States that were largely unexplained—at least to the general public. Flashes of light, the sound of explosion, the discovery of mysterious fragments—all amounted to little concrete information to go on. First, the discovery of a large balloon miles off the California coast by the Navy on November 4, 1944. A month later, on December 6, 1944, witnesses reported an explosion and flame near Thermopolis, Wyoming. Reports of fallen balloons began to trickle in to local law enforcement with enough frequency that it was clear something unprecedented in the war had emerged that demanded explanation. Military officials began to piece together that a strange new weapon, with markings indicating it had been manufactured in Japan, had reached American shores. They did not yet know the extent or capability or scale of these balloon bombs.
Though relatively simple as a concept, these balloons—which aviation expert Robert C. Mikesh describes in Japan’s World War II Balloon Bomb Attacks on North America as the first successful intercontinental weapons, long before that concept was a mainstay in the Cold War vernacular—required more than two years of concerted effort and cutting-edge technology engineering to bring into reality. Japanese scientists carefully studied what would become commonly known as the jet stream, realizing these currents of wind could enable balloons to reach United States shores in just a couple of days. The balloons remained afloat through an elaborate mechanism that triggered a fuse when the balloon dropped in altitude, releasing a sandbag and lightening the weight enough for it to rise back up. This process would repeat until all that remained was the bomb itself. By then, the balloons would be expected to reach the mainland; an estimated 1,000 out of 9,000 launched made the journey. Between the fall of 1944 and summer of 1945, several hundred incidents connected to the balloons had been cataloged.
The balloons not only required engineering acumen, but a massive logistical effort. Schoolgirls were conscripted to labor in factories manufacturing the balloons, which were made of endless reams of paper and held together by a paste made of konnyaku, a potato-like vegetable. The girls worked long, exhausting shifts, their contributions to this wartime project shrouded in silence. The massive balloons would then be launched, timed carefully to optimize the wind currents of the jet stream and reach the United States. Engineers hoped that the weapons’ impact would be compounded by forest fires, inflicting terror through both the initial explosion and an ensuing conflagration. That goal was stymied in part by the fact that they arrived during the rainy season, but had this goal been realized, these balloons may have been much more than an overlooked episode in a vast war.
As reports of isolated sightings (and theories on how they got there, ranging from submarines to saboteurs) made their way into a handful of news reports over the Christmas holiday, government officials stepped in to censor stories about the bombs, worrying that fear itself might soon magnify the effect of these new weapons. The reverse principle also applied—while the American public was largely in the dark in the early months of 1945, so were those who were launching these deadly weapons. Japanese officers later told the Associated Press that “they finally decided the weapon was worthless and the whole experiment useless, because they had repeatedly listened to [radio broadcasts] and had heard no further mention of the balloons.” Ironically, the Japanese had ceased launching them shortly before the picnicking children had stumbled across one.
However successful censorship had been in discouraging further launches, this very censorship “made it difficult to warn the people of the bomb danger,” writes Mikesh. “The risk seemed justified as weeks went by and no casualties were reported.” After that luck ran out with the Gearhart Mountain deaths, officials were forced to rethink their approach. On May 22, the War Department issued a statement confirming the bombs’ origin and nature “so the public may be aware of the possible danger and to reassure the nation that the attacks are so scattered and aimless that they constitute no military threat.” The statement was measured, providing enough information to avoid further casualties without giving the enemy encouragement. But by then, Germany’s surrender dominated headlines. Word of the Bly, Oregon, deaths, and of the strange mechanism that had killed them, was overshadowed by the dizzying pace of the finale in the European theater.
The silence meant that for decades, grieving families were sometimes met with skepticism or outright disbelief. The balloon bombs have been so overlooked that during the making of the documentary On Paper Wings, several of those who lost family members told filmmaker Ilana Sol of reactions to their unusual stories. “They would be telling someone about the loss of their sibling and that person just didn’t believe them,” Sol recalls.
While much of the American public may have forgotten, the families in Bly never would. The effects of that moment would reverberate throughout the Mitchell family, shifting the trajectory of their lives in unexpected ways. Two years later, Rev. Mitchell would go on to marry Betty Patzke, the eldest of the ten children in Dick and Joan Patzke’s family (they lost another brother fighting in the war), and fulfill the dream he and Elsye once shared of going overseas as missionaries. (Rev. Mitchell was later kidnapped from a leprosarium while he and Betty were serving as missionaries in Vietnam; 57 years later his fate remains unknown).
“When you talk about something like that, as bad as it seems when that happened and everything, I look at my four children, they never would have been, and I’m so thankful for all four of my children and my ten grandchildren. They wouldn’t have been if that tragedy hadn’t happened,” Betty Mitchell told Sol in an interview.
The Bly incident also struck a chord decades later in Japan. In the late 1980s, University of Michigan professor Yuzuru “John” Takeshita, who as a child had been incarcerated as a Japanese-American in California during the war and was committed to healing efforts in the decades after, learned that the wife of a childhood friend had built the bombs as a young girl. He facilitated a correspondence between the former schoolgirls and the residents of Bly whose community had been turned upside down by one of the bombs they built. The women folded 1,000 paper cranes as a symbol of regret for the lives lost. On Paper Wings shows them meeting face-to-face in Bly decades later. Those gathered embodied a sentiment echoed by the Mitchell family. “It was a tragic thing that happened,” says Judy McGinnis-Sloan, Betty Mitchell’s niece. “But they have never been bitter over it.”
The loss of these six lives puts into relief the enormity of a war that swallowed up entire cities. At the same time as Bly residents were absorbing the loss they had endured, over the spring and summer of 1945 more than 60 Japanese cities burned, including Tokyo in its infamous firebombing. On August 6, 1945, the first atomic bomb was dropped on the city of Hiroshima, followed three days later by another on Nagasaki. In total, an estimated 500,000 or more Japanese civilians would be killed. Sol recalls “working on these interviews and just thinking my God, this one death caused so much pain, what if it was everyone and everything? And that’s really what the Japanese people went through.”
In August of 1945, days after Japan announced its surrender, nearby Klamath Falls’ Herald and News published a retrospective, noting that “it was only by good luck that other tragedies were averted” and warning that balloon bombs likely still lay undiscovered across the vast West. “And so ends a sensational chapter of the war,” it observed. “But Klamathites were reminded that it still can have a tragic sequel.”
While the tragedy of that day in Bly has not been repeated, the sequel remains a real—if remote—possibility. In 2014, a couple of forestry workers in Canada came across one of the unexploded balloon bombs, which still posed enough of a danger that a military bomb disposal unit had to blow it up. Nearly three-quarters of a century later, these unknown remnants are a reminder that even the most overlooked scars of war are slow to fade.
|