[SOURCE: https://en.wikipedia.org/wiki/Siachen_Glacier] | [TOKENS: 3160] |
Siachen Glacier The Siachen Glacier is a glacier located in the eastern Karakoram range of the Himalayas, just northeast of the point NJ9842 where the Line of Control between India and Pakistan ends in northeastern Kashmir. At 76 km (47 mi) long, it is the longest glacier in the Karakoram and the second-longest in the world's non-polar areas. It falls from an altitude of 5,753 m (18,875 ft) above sea level at its head at Indira Col on the India–China border down to 3,620 m (11,875 ft) at its terminus. The entire Siachen Glacier, with all major passes, has been under the administration of India as part of the union territory of Ladakh since 1984. Pakistan maintains a territorial claim over the Siachen Glacier and controls the region west of the Saltoro Ridge, which lies immediately west of the glacier, with Pakistani posts located 1 km below more than 100 Indian posts on the ridge. The Siachen Glacier lies immediately south of the great drainage divide that separates the Eurasian Plate from the Indian subcontinent in the extensively glaciated portion of the Karakoram sometimes called the "Third Pole". The glacier lies between the Saltoro Ridge immediately to the west and the main Karakoram range to the east. The Saltoro Ridge originates in the north from the Sia Kangri peak on the China border in the Karakoram range. Altitudes along the crest of the Saltoro Ridge range from 5,450 to 7,720 m (17,880 to 25,330 ft). The major passes on this ridge are, from north to south, Sia La at 5,589 m (18,336 ft), Bilafond La at 5,450 m (17,880 ft), and Gyong La at 5,689 m (18,665 ft). The average winter snowfall is more than 1000 cm (35 ft) and temperatures can dip to −50 °C (−58 °F). Including all tributary glaciers, the Siachen Glacier system covers about 700 km2 (270 sq mi). Etymology "Sia" in the Balti language refers to the rose family plant widely dispersed in the region. "Chen" refers to any object found in abundance. The name Siachen thus refers to a land with an abundance of roses. 
The naming of the glacier itself, or at least its currency, is attributed to Tom Longstaff. Dispute Both India and Pakistan claim sovereignty over the entire Siachen region. In June 1958, the first Geological Survey of India expedition went to the Siachen glacier. It was the first official Indian survey of the glacier after 1947, undertaken to commemorate the International Geophysical Year of 1958. The study included snout surveys of five glaciers in the Ladakh region: the Siachen, Mamostong, Chong Kumdan, Kichik Kumdan and Aktash glaciers. The expedition assigned the Siachen glacier the number 5Q 131 05 084. U.S. and Pakistani maps in the 1970s and 1980s consistently showed a dotted line from NJ9842 (the northernmost demarcated point of the India-Pakistan cease-fire line, also known as the Line of Control) to the Karakoram Pass, which India believed to be a cartographic error and in violation of the Simla Agreement. In 1984, India launched Operation Meghdoot, a military operation that gave India control over all of the Siachen Glacier, including its tributaries. Between 1984 and 1999, frequent skirmishes took place between India and Pakistan. Indian troops under Operation Meghdoot pre-empted Pakistan's Operation Ababeel by just one day to occupy most of the dominating heights on the Saltoro Ridge to the west of the Siachen Glacier. However, more soldiers have died from the harsh weather conditions in the region than from combat. Pakistan lost 353 soldiers in various operations recorded between 2003 and 2010 near Siachen, including 140 Pakistanis killed in the 2012 Gayari Sector avalanche. Between January 2012 and July 2015, 33 Indian soldiers died due to adverse weather. 
In December 2015, Indian Union Minister of State for Defence Rao Inderjit Singh said in a written reply in the Lok Sabha that a total of 869 Army personnel had died on the Siachen glacier due to climatic conditions, environmental and other factors since the Army launched Operation Meghdoot in 1984. In February 2016, Indian Defence Minister Manohar Parrikar stated that India would not vacate Siachen, as there is a trust deficit with Pakistan, and said that 915 people had died in Siachen since Operation Meghdoot began in 1984. According to official records, only 220 Indian soldiers have been killed by enemy bullets in the Siachen area since 1984. Both India and Pakistan continue to deploy thousands of troops in the vicinity of Siachen, and attempts to demilitarize the region have so far been unsuccessful. Prior to 1984, neither country had any military forces in this area. Aside from the Indian and Pakistani military presence, the glacier region is unpopulated. The nearest civilian settlement is the village of Warshi, 10 miles downstream from the Indian base camp. The region is also extremely remote, with limited road connectivity. On the Indian side, roads go only as far as the military base camp at Dzingrulma (35°09′59″N 77°12′58″E / 35.1663°N 77.2162°E / 35.1663; 77.2162), 72 km from the head of the glacier. The Indian Army has developed various means to reach the Siachen region, including the Manali-Leh-Khardung La-Siachen route. In 2012, Chief of Army Staff of the Indian Army General Bikram Singh said that the Indian Army should stay in the region for strategic advantages, and because a "lot of blood has been shed" by Indian armed personnel for Siachen. 
The present ground positions, relatively stable for over a decade, mean that India maintains control over all of the 76 kilometres (47 mi) Siachen Glacier and all of its tributary glaciers, as well as all the main passes and heights of the Saltoro Ridge immediately west of the glacier, including Sia La, Bilafond La, Gyong La, Yarma La (6,100 m), and Chulung La (5,800 m). Pakistan controls the glacial valleys immediately west of the Saltoro Ridge. According to TIME magazine, India gained over 1,000 square miles (3,000 km2) of territory because of its 1980s military operations in Siachen. India has categorically stated that it will not pull its army from Siachen until the 110 km-long Actual Ground Position Line (AGPL) is first authenticated, delineated and then demarcated. The 1949 Karachi agreement carefully delineated the line of separation only up to point NJ9842, after which, the agreement states, the line would continue "thence north to the glaciers". According to the Indian stance, the line of separation should continue roughly northwards along the Saltoro Range to the west of the Siachen glacier beyond NJ9842; international boundary lines that follow mountain ranges often do so by following the watershed drainage divide, such as that of the Saltoro Range. The 1972 Simla Agreement made no change to the 1949 Line of Control in this northernmost sector. Drainage The glacier's melting waters are the main source of the Nubra River in the Indian region of Ladakh, which drains into the Shyok River. The Shyok in turn joins the 3000 kilometre-long Indus River, which flows through Pakistan. The glacier is thus a major source of the Indus and feeds the largest irrigation system in the world. Environmental issues The glacier was uninhabited before 1984, and the presence of thousands of troops since then has brought pollution to the glacier and accelerated its melting. To support the troops, glacial ice has been cut and melted with chemicals. 
Dumping of non-biodegradable waste in large quantities and the use of arms and ammunition have considerably affected the ecosystem of the region. Preliminary findings of a survey by the Pakistan Meteorological Department in 2007 revealed that the Siachen glacier has been retreating for the past 30 years and is melting at an alarming rate. The study of satellite images of the glacier showed that it is retreating at a rate of about 110 metres a year and that its size has decreased by almost 35 percent. In an eleven-year period, the glacier had receded nearly 800 metres, and in seventeen years about 1700 metres. It is predicted that the glaciers of the Siachen region will be reduced to about one-fifth of their 2011 size by 2035. In the twenty-nine-year period 1929–1958, well before the military occupation, the glacial retreat was recorded to be about 914 metres. One theorized reason for the recent glacial retreat is the chemical blasting used to construct camps and posts. In 2001, India laid oil pipelines (about 250 kilometres long) inside the glacier to supply kerosene and aviation fuel to the outposts from base camps. As of 2007, the temperature rise at Siachen was estimated at 0.2 °C annually, causing melting, avalanches, and crevasses in the glacier. The waste produced by the troops stationed there is dumped in the crevasses of the glacier. Mountaineers who visited the area on climbing expeditions witnessed large amounts of garbage, empty ammunition shells, parachutes and other refuse dumped on the glacier, which neither decompose nor can be burned because of the extreme climatic conditions. About 1,000 kilograms (1.1 short tons) of waste is produced and dumped in glacial crevasses daily by Indian forces. The Indian army is said to have planned a "Green Siachen, Clean Siachen" campaign to airlift garbage from the glacier, and to use biodigesters, which work in the absence of oxygen and at freezing temperatures, for biodegradable waste. 
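The retreat figures quoted above imply simple period-average rates that can be checked directly. A back-of-the-envelope sketch (assuming constant retreat within each period; the survey's own methodology is not given here):

```python
# Average retreat rates implied by the figures from the 2007 survey.
retreat_11yr_m = 800     # metres receded over the eleven-year period
retreat_17yr_m = 1700    # metres receded over the seventeen-year period

rate_11yr = retreat_11yr_m / 11   # metres per year, eleven-year average
rate_17yr = retreat_17yr_m / 17   # metres per year, seventeen-year average

print(round(rate_11yr, 1), round(rate_17yr, 1))
```

Both period averages come out below the roughly 110 metres per year estimated from satellite imagery, which would be consistent with the retreat having accelerated in recent years.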
Almost forty percent of the waste left at the glacier is plastic and metal, including toxins such as cobalt, cadmium and chromium that eventually affect the water of the Shyok River (which ultimately enters the Indus River near Skardu). The Indus is used for drinking and irrigation. Scientists at The Energy and Resources Institute are researching ways to dispose of the garbage generated at the glacier. Some scientists of the Defence Research and Development Organisation who went on an expedition to Antarctica are also working to produce a bacterium that can thrive in extreme weather conditions and help decompose biodegradable waste naturally. The flora and fauna of the Siachen region are also affected by the huge military presence. The region is home to rare species including the snow leopard, brown bear and ibex, which are at risk because of the military presence. Border conflict The glacier's region is the highest battleground on Earth, where Pakistan and India have fought intermittently since April 1984. Both countries maintain a permanent military presence in the region at a height of over 6,000 m (20,000 ft). Both India and Pakistan have wished to disengage from the costly military outposts. India launched Operation Meghdoot to occupy the Siachen Glacier in 1984. Then, after the Pakistani incursions during the Kargil War in 1999, India abandoned plans to withdraw from Siachen, wary of further Pakistani incursions if it vacated the Siachen Glacier posts. Prime Minister Manmohan Singh became the first Indian Prime Minister to visit the area, calling during his visit for a peaceful resolution of the problem. Narendra Modi also visited the area later as Prime Minister. President of Pakistan Asif Ali Zardari visited the Gayari Sector, an area near the Siachen Glacier, in 2012 with Pakistan Army Chief Gen. Ashfaq Parvez Kayani. 
Both of them expressed their commitment to resolving the Siachen conflict as early as possible. In the previous year, the President of India, Abdul Kalam, had become the first head of state to visit the area. Since September 2007, India has opened up limited mountaineering and trekking expeditions to the area. The first group included cadets from Chail Military School, the National Defence Academy, the National Cadet Corps, the Indian Military Academy and the Rashtriya Indian Military College, along with family members of armed forces officers. The expeditions are also meant to show international audiences that Indian troops hold "almost all dominating heights" on the key Saltoro Ridge and that Pakistani troops are nowhere near the Siachen Glacier. Ignoring protests from Pakistan, India maintains that it does not need anyone's approval to send trekkers to Siachen, in what it says is essentially its own territory. In addition, the Indian Army's Army Mountaineering Institute (AMI) functions out of the region. Peace Park proposal The idea of declaring the Siachen region a "Peace Park" was presented by environmentalists and peace activists, in part to preserve the ecosystem of the region, which has been badly affected by the military presence. In September 2003, participants of the 5th World Parks Congress, held in Durban, urged the governments of India and Pakistan to establish a peace park in the Siachen region to restore the natural biological system and protect species whose lives are at risk. At the conference, Italian ecologist Giuliano Tallone said the region's ecology was at serious risk and proposed setting up a Siachen Peace Park. After the proposal of a transboundary peace park was floated, the International Mountaineering and Climbing Federation (UIAA) and the International Union for Conservation of Nature (IUCN) organised a conference at Geneva and invited Indian and Pakistani mountaineers (Mandip Singh Soin, Harish Kapadia, Nazir Sabir and Sher Khan). 
The region was nominated for inclusion in the United Nations' World Heritage List as a part of the Karakoram range, but this was deferred by the World Heritage Committee. The areas to the east and west of the Siachen region have already been declared national parks: the Karakoram Wildlife Sanctuary in India and the Central Karakoram National Park in Pakistan. Sandia National Laboratories organised conferences at which military experts and environmentalists from India, Pakistan and other countries were invited to present joint papers. Kent L. Biringer, a researcher at Sandia's Cooperative Monitoring Center, suggested setting up a Siachen Science Center, a high-altitude research centre where scientists and researchers from both countries could carry out research in glaciology, geology, atmospheric sciences and related fields. In popular culture In the 2018 Hollywood movie Mission: Impossible – Fallout, a rogue agent plants nuclear bombs at the base of the Siachen Glacier. The scene was actually filmed at Preikestolen, Norway, after the Indian government denied permission to film in Kashmir. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_ref-123] | [TOKENS: 9291] |
Internet The Internet (or internet) is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP) to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. 
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. 
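Of the two principal name spaces mentioned above, the IP address space is the easier to illustrate: an IPv4 address is simply a 32-bit number, conventionally written as four dotted octets, and blocks of the space are delegated as prefixes. A minimal sketch using Python's standard `ipaddress` module (the addresses are from the documentation-reserved range, not tied to any real host):

```python
import ipaddress

# An IPv4 address is a 32-bit integer, conventionally written as four octets.
addr = ipaddress.IPv4Address("192.0.2.34")   # documentation-range address
print(int(addr))                             # the same address as one integer

# Address blocks are carved out of the space as prefixes (CIDR notation).
net = ipaddress.ip_network("192.0.2.0/24")
print(net.num_addresses)                     # a /24 prefix spans 256 addresses
print(addr in net)                           # membership test
```

DNS then maps human-readable names onto this numeric space, which is why the two name spaces are administered together under ICANN.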
History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching, one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. 
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and CompuServe established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. 
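The HTTP 0.9 protocol in Berners-Lee's original toolset was strikingly small: a request was a single line naming a method and a path, and the server replied with the raw HTML document and closed the connection, with no status line, headers, or content type. A sketch of the exchange as plain strings (the response body here is invented for illustration):

```python
# An HTTP/0.9 request: just a method and a path, terminated by CRLF.
request = "GET /hypertext/WWW/TheProject.html\r\n"

# The server replied with the raw HTML body and then closed the connection;
# there was no status line, no headers, and no content type.
response = "<html><body>The WorldWideWeb project</body></html>"

method, path = request.strip().split(" ")
print(method, path)
```

Everything the modern protocol adds (status codes, headers, content negotiation) arrived with HTTP/1.0 and later.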
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit growth characteristics similar to the scaling of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. 
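The two growth figures quoted above describe different annual rates, as a quick compound-growth calculation shows:

```python
# Doubling every 18 months corresponds to an annual growth factor of 2**(12/18).
annual_factor = 2 ** (12 / 18)
print(round(annual_factor, 3))   # about 1.587, i.e. roughly 59% per year

# "100 percent per year" is a doubling every 12 months: after n years,
# traffic has grown by a factor of 2**n.
years = 5
print(2 ** years)                # 32x after five years at 100% per year
```

So the late-1990s traffic estimate (100% per year) implies growth noticeably faster than the Moore's-law-style 18-month doubling.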
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. 
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
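The mojibake mentioned above is easy to reproduce: text encoded under one character encoding and decoded under another comes out garbled, and because Latin-1 maps every byte value to a character, this particular round trip is reversible. A minimal sketch in Python:

```python
# UTF-8 bytes mis-decoded as Latin-1 produce classic mojibake.
original = "café"
utf8_bytes = original.encode("utf-8")      # b'caf\xc3\xa9'
garbled = utf8_bytes.decode("latin-1")     # 'cafÃ©' (the mojibake form)
print(garbled)

# Because Latin-1 assigns a character to every byte, the damage is reversible:
repaired = garbled.encode("latin-1").decode("utf-8")
print(repaired == original)
```

This is exactly why Unicode-based encodings such as UTF-8, which cover the world's writing systems under a single standard, have displaced the patchwork of regional encodings.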
Several neologisms refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general, or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in the reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to the examination of pending patent applications.
Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. 
Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023[update], Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP, so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites such as DonorsChoose and GlobalGiving allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, aided by collaborative software, which allows groups to form easily, communicate cheaply, and share ideas.
A notable product of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice).[citation needed] Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work.[citation needed] The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work.[citation needed] The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online, such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Out of naivety, children may also post personal information about themselves online, which could put them or their families at risk unless they are warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites.[citation needed] Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated with users' loneliness.
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread.[citation needed] Cyberslacking can become a drain on corporate resources; employees may spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving scan-reading skills while interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, worldwide e-commerce, combining global business-to-business and business-to-consumer transactions, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet, such as maps and location-aware services, may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people.
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing and carrying out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region.
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form involving highly dispersed small groups of practitioners who may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed] Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide.
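The anatomy of a URI can be inspected with Python's standard urllib.parse module. This minimal sketch (the URL shown is just an illustrative example) splits a web address into the components that give it global meaning, the scheme (protocol), the network location (server), and the path (resource):

```python
from urllib.parse import urlparse

# Decompose a URL into its standard components.
uri = urlparse("https://en.wikipedia.org/wiki/Internet?printable=yes#History")
print(uri.scheme)    # https          -- which protocol to speak
print(uri.netloc)    # en.wikipedia.org -- which server to contact
print(uri.path)      # /wiki/Internet -- which resource to request
print(uri.query)     # printable=yes  -- optional parameters
print(uri.fragment)  # History        -- position within the resource
```

The same decomposition is what a browser performs before opening a connection: the netloc is resolved to an IP address, and the path and query are sent to the server.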
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; it is one of many protocols used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed, usually fully encrypted, across the Internet.
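HTTP, described above as the Web's main access protocol, is at heart a plain-text protocol. A minimal sketch of the request a browser sends can be composed without any network access (the hostname and path here are illustrative, not a real request):

```python
# A minimal HTTP/1.1 GET request: a few lines of plain text sent over
# a TCP connection to port 80 (or, wrapped in TLS, to port 443).
host, path = "example.org", "/index.html"
request = (
    f"GET {path} HTTP/1.1\r\n"   # request line: method, target, version
    f"Host: {host}\r\n"          # mandatory in HTTP/1.1 (virtual hosting)
    "Connection: close\r\n"      # ask the server to close after replying
    "\r\n"                       # blank line terminates the header block
)
print(request)
```

The server answers in the same textual form: a status line (e.g. "HTTP/1.1 200 OK"), response headers, a blank line, and then the body of the requested document.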
The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting working groups, open to any individual, on the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other, less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters.
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region.[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed] Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems.
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks.
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber-optic submarine communication cables, connecting the internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, named after its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123.[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol, or are configured manually.[citation needed] The Domain Name System (DNS) converts user-inputted domain names (e.g.
"en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (109) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network.: 1, 16 Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier. 
The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information.
Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibilities of cyber warfare waged by hackers using similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S.
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block specific offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: Global Internet Traffic Volume, in petabytes per month, 1990–2015] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests.
Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade differing by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
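The factor-of-20,000 spread quoted above follows directly from the two extremes reported; a quick arithmetic check:

```python
# Sanity-check the reported spread in Internet energy-intensity estimates.
low, high = 0.0064, 136  # kWh per gigabyte, the two extremes cited above

print(round(high / low))  # 21250 -- roughly the "factor of 20,000" claimed
```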
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Special:BookSources/978-0-7637-5730-4] | [TOKENS: 380] |
Book sources This page allows users to search multiple sources for a book given its 10- or 13-digit International Standard Book Number (ISBN). Spaces and dashes in the ISBN do not matter. This page links to catalogs of libraries, booksellers, and other book sources where you will be able to search for the book by its ISBN. Online text Google Books and other retail sources below may be helpful if you want to verify citations in Wikipedia articles, because they often let you search an online version of the book for specific words or phrases, or you can browse through the book (although for copyright reasons the entire book is usually not available). At the Open Library (part of the Internet Archive) you can borrow and read entire books online. Online databases Subscription eBook databases Libraries Alabama Alaska California Colorado Connecticut Delaware Florida Georgia Illinois Indiana Iowa Kansas Kentucky Massachusetts Michigan Minnesota Missouri Nebraska New Jersey New Mexico New York North Carolina Ohio Oklahoma Oregon Pennsylvania Rhode Island South Carolina South Dakota Tennessee Texas Utah Washington state Wisconsin Bookselling and swapping Find your book on a site that compiles results from other online sites: These sites allow you to search the catalogs of many individual booksellers: Non-English book sources If the book you are looking for is in a language other than English, you might find it helpful to look at the equivalent pages on other Wikipedias, linked below – they are more likely to have sources appropriate for that language. Find other editions The WorldCat xISBN tool for finding other editions is no longer available. However, there is often a "view all editions" link on the results page from an ISBN search. Google Books often lists other editions of a book and related books under the "About this book" link. 
You can convert between 10- and 13-digit ISBNs with conversion tools.
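The conversion itself is mechanical: prefix the first nine digits of the ISBN-10 with 978 and recompute the check digit under the ISBN-13 rule (alternating weights of 1 and 3, modulo 10). A minimal sketch (the function name is illustrative):

```python
def isbn10_to_isbn13(isbn10: str) -> str:
    """Convert an ISBN-10 to ISBN-13: prefix 978, recompute the check digit."""
    # Spaces and dashes do not matter, so strip everything but digits (and X).
    digits = [c for c in isbn10 if c.isdigit() or c in "Xx"]
    body = "978" + "".join(digits[:9])  # drop the old ISBN-10 check digit
    total = sum((3 if i % 2 else 1) * int(d) for i, d in enumerate(body))
    return body + str((10 - total % 10) % 10)

print(isbn10_to_isbn13("0-7637-5730-6"))  # 9780763757304
```

Converting in the other direction is symmetric: drop the 978 prefix and recompute the ISBN-10 check digit with weights 10 down to 2, modulo 11.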
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-controllersquare_174-1] | [TOKENS: 10728] |
PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative software sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. 
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. 
Although the initial agreement between Nintendo and Sony concerned the production of a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible, Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to the music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. 
At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but decided to build on what it had developed with Nintendo and Sega, turning it into a console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. 
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, there remained opposition from a majority present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed the project. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. 
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to use Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European and North American divisions, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under the Sony name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. 
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and although Namco was a longstanding Nintendo developer, it was already confirmed behind closed doors by December 1993 that the game would be the PlayStation's first title. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken following in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. 
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the studio played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of a condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems' approach, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems went on to produce development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. 
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own over non-Sony products, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs was beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. One retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. 
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage, who simply said "$299" and left the stage to a round of applause. Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer, and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance, and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared with the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season, compared to Sega's £4 million. 
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched in a test market across Sony showrooms during 1999–2000, selling 100 units. Sony finally launched the console countrywide (as the PS One model) on 24 January 2002, at a price of Rs 7,990 and with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, the console could not be released because a third company had registered the trademark; the officially distributed Sega Saturn initially took over the market, but as the Sega console was withdrawn, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation grew to a base of 300,000 users by January 2000, even though Sony China had no plans to release it. 
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans stylised as "LIVE IN YUR WRLD. PLY IN URS" ("Live in Your World. Play in Ours.", with the controller's geometric shapes standing in for the missing letters) and "U R NOT E" (with a red "E", read as "You are not ready"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity. 
Sony partnered with prominent nightclub owners such as Ministry of Sound and festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. 
Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry; Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PS one, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix-math coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, offers a sampling rate of up to 44.1 kHz, and supports music sequencing. 
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows for the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transformation Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can draw a maximum of 4,000 sprites and 180,000 texture-mapped polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent revisions removed the parallel port, with the final versions retaining only the serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was only available through a mail-order service and came with the documentation and software needed to program PlayStation games and applications, including a C compiler. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack", which also included a car cigarette-lighter adaptor, adding a degree of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the console in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), two shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons bearing simple geometric shapes: a green triangle, red circle, blue cross, and pink square. Rather than labelling its buttons with the letters or numbers traditionally used, the PlayStation controller established a visual trademark that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this mapping is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper, used to access menus.
The European and North American models of the PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously seen on consoles such as the Vectrex: instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The right joystick also carries a thumb-operated digital hat switch, corresponding to the traditional D-pad and used when simple digital movements were sufficient. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users finer control over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which introduced two new buttons, mapped to clicking the sticks in), the Dual Analog Controller features an "Analog" button and LED beneath the Start and Select buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony removed the haptic feedback from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Rumble Pak for the Nintendo 64 controller. A Nintendo spokesman, however, denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, its name deriving from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features textured rubber grips on its analogue sticks, longer handles, slightly different shoulder buttons, and rumble feedback as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra functionality to the PlayStation. These include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. It proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but cancelled the release, despite having promoted it in Europe and North America. In addition to playing games, most PlayStation models can also play audio CDs.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models include a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without a game disc inserted or before closing the CD tray, which brings up a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs between the PlayStation and PS One depending on the firmware version: the original PlayStation GUI has a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI has a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing the use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R media and optical disc drives with burning capability.
To preclude illegal copying, Sony developed a proprietary process for PlayStation disc manufacturing that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, a conventional drive could not detect the wobble frequency, and so copies of discs omitted it, since the laser pick-up system of any optical disc drive interprets this wobble as an oscillation of the disc surface and compensates for it during reading. Early PlayStations, particularly early SCPH-1000 models, can exhibit skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface that dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out, usually unevenly, due to friction.
The placement of the laser unit close to the power supply accelerates wear, as the additional heat makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser tilts and no longer points directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem on later PlayStation models by making the sled out of die-cast metal and placing the laser unit further away from the power supply. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment stood at 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (known outside Japan as Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives of this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred it. Reception The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim of Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo. Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5; for each editor, this was the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years as developers mastered the system's capabilities and Sony revised its stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the first generation to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, with the video game division coming to contribute 23% of the company's operating profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, continuing the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third-best console, crediting its sophisticated 3D capabilities as a key factor in its mass success and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh-best console in their list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh-best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse-engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64, likely because the proprietary cartridge format helped enforce copy protection, given Nintendo's substantial reliance on licensing and exclusive games for its revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for audio CDs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get onto the market, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also gave publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers, while the CD format's lower production costs and flexibility to meet demand added to its appeal among publishers.
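The economics described above can be illustrated with a worked example. Every figure below is hypothetical, chosen only to show how a roughly 40% lower retail price can leave per-unit net revenue unchanged when the cost of the media falls sharply:

```python
# Hypothetical illustration of the CD vs cartridge economics: all
# figures are invented for the example, not Sony's actual numbers.

cart_price = 70.0              # hypothetical cartridge retail price ($)
cart_media_cost = 30.0         # hypothetical per-unit cartridge manufacturing cost
cd_price = cart_price * 0.6    # about 40% lower retail price for a CD game
cd_media_cost = 2.0            # hypothetical per-unit CD pressing cost

cart_net = cart_price - cart_media_cost  # net revenue per cartridge sold
cd_net = cd_price - cd_media_cost        # net revenue per CD sold

print(cart_net, cd_net)  # prints: 40.0 40.0
```

On this reading, the cheaper media absorbs the entire retail price cut; the real prices and margins varied by publisher and territory.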
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen Nintendo 64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many developed either by Nintendo themselves or by second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (the original design without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a system on a chip with four Cortex-A35 central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. It received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions of certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-VanceA1_47-1]
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his mother, who was born there. He received bachelor's degrees from the University of Pennsylvania in 1997 before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002, the year Musk became an American citizen. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, Tesla approved a pay package for Musk worth $1 trillion, to be paid out over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for spreading COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa; Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator, and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Although both Elon and Errol had previously stated that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around nine, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies", where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself to program from the VIC-20 user manual. At age twelve, he sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. He was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification.
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, and to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997: a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided instead to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa had transitioned to an H-1B; according to numerous former business associates and shareholders, however, Musk said at the time that he was on a student visa.
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. In an interview with Rolling Stone, Musk rejected the notion that they had started the company with funds borrowed from Errol Musk, though in a tweet he acknowledged that his father had contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. 
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. 
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and took over as CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech on the platform increased after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of Hunter Biden's laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll went against him, and five months later he did so, transitioning to executive chairman and chief technology officer (CTO). Despite Musk's stepping down as CEO, X has continued to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks (which hinders visibility and is considered a form of shadow banning) or suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. 
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. 
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." 
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and suggested that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. 
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. 
A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and he regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, which he has repeatedly pushed for so that humanity can become an interplanetary species and lower the risk of human extinction. 
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While he describes himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently criticized for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years but was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement's details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. 
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
Also in July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to the Brins' divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein emailed Musk asking whether he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein replied, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l]

Wealth

Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals.

Public image

Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." |
======================================== |
[SOURCE: https://www.theverge.com/news/717754/tesla-autopilot-crash-liable-jury-trial-damages] | [TOKENS: 1673] |
Tesla to pay more than $200 million in damages after being found partly liable for fatal Autopilot crash

A jury ruled that Tesla was 33 percent responsible for the crash.

By Emma Roth and Jay Peters | Aug 1, 2025, 8:57 PM UTC

A federal jury in Florida found Tesla partly liable for a deadly 2019 crash involving Tesla’s Autopilot driver assist software, according to reports from The New York Times and CNBC. Tesla has been ordered to pay $200 million in punitive damages and about $43 million in compensatory damages, CBS News reports. It’s a rare loss in court for Tesla over Autopilot, which has been linked to hundreds of crashes and dozens of deaths by the National Highway Traffic Safety Administration. The company won two jury trials in 2023 resulting from lawsuits alleging that Autopilot was to blame for crashes, and last year, a lawsuit challenging Tesla’s claims about Autopilot was dismissed by a federal judge.
The loss also comes as Tesla is starting to test its robotaxi service in Austin and the Bay Area — though in the latter location, it arguably isn’t a robotaxi service just yet. Tesla’s Autopilot feature is designed to control a vehicle’s steering and brakes; however, some argue that the EV maker has misled drivers about its cars’ capabilities. The California Department of Motor Vehicles, for example, has accused Tesla of falsely advertising its Autopilot and Full Self-Driving capabilities as autonomous driving features. During the trial, which started in July, plaintiffs argued that Tesla’s driver-assist software was at fault for causing a crash that killed 22-year-old Naibel Benavides. While driving in Key Largo, Florida, Tesla owner George McGee crashed into Benavides’ vehicle after bending over to grab a phone that he had dropped. McGee told the jury he thought Autopilot “would protect him and prevent a serious crash if he made a mistake,” according to the NYT. “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology,” the company said in a statement to the NYT.
The company plans to appeal.
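The verdict's arithmetic can be sketched as a short comparative-fault calculation. This is an illustrative helper only: the 33 percent fault share and the $200 million punitive figure come from the article, while the $129 million total compensatory award is an assumed figure chosen to be consistent with the reported roughly $43 million share, not a number stated in this piece.

```python
def damages_breakdown(total_compensatory: float, fault_share: float,
                      punitive: float) -> dict:
    """Split a jury award under comparative fault: the defendant pays
    its fault share of compensatory damages plus any punitive award."""
    compensatory_share = total_compensatory * fault_share
    return {
        "compensatory": compensatory_share,
        "punitive": punitive,
        "total": compensatory_share + punitive,
    }

# Assumed $129M total compensatory award at 33% fault, plus $200M punitive:
breakdown = damages_breakdown(129e6, 0.33, 200e6)
# breakdown["compensatory"] is about $43M, matching the reported figure.
```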
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/International_Herald_Tribune] | [TOKENS: 1575] |
International Herald Tribune

The International Herald Tribune (IHT) was a daily English-language newspaper published in Paris, France, for international English-speaking readers. It published under the name International Herald Tribune starting in 1967, but its origins as an international newspaper trace back to 1887. Sold in over 160 countries, the International Herald Tribune produced a large amount of content until it became the second incarnation of The International New York Times in 2013, 10 years after The New York Times Company became its sole owner.

Early years

In 1887, James Gordon Bennett Jr. created a Paris edition of his newspaper the New York Herald with offices at 49, avenue de l'Opéra. He called it the Paris Herald. When Bennett Jr. died, the Herald and its Paris edition came under the control of Frank Munsey. In 1924, Munsey sold the paper to the family of Ogden Reid, owners of the New-York Tribune, creating the New York Herald Tribune, while the Paris edition became the Paris Herald Tribune. By 1967, the paper was owned jointly by Whitney Communications, The Washington Post and The New York Times, and became known as the International Herald Tribune, or IHT.

The International Herald Tribune years

The first issue of the International Herald Tribune was published on May 22, 1967. It continued the practices that had endeared it to American expatriates and travelers, such as carrying baseball scores and stock prices. At the start, the paper maintained the offices it inherited from the Herald Tribune European Edition – which dated to 1931 – at 21 Rue de Berri, just off the Champs-Élysées. Columnist Art Buchwald recalled them as being "grubby" and antiquated but "the perfect location for an American newspaper abroad." Then in 1978, the paper moved its facilities to the Parisian suburb of Neuilly-sur-Seine.
In 1974, the paper pioneered electronic transmission of facsimile pages across borders when it opened a remote printing facility in London. This was followed by a printing site in Zurich in 1977. The International Herald Tribune began transmitting electronic images of newspaper pages from Paris to Hong Kong via satellite in 1980, making the paper simultaneously available on opposite sides of the planet. This was the first such intercontinental transmission of an English-language daily newspaper and followed the pioneering efforts of the Chinese-language newspaper Sing Tao Daily (星島日報). Additional printing locations followed, including Rome and Tokyo in 1987, and Frankfurt in 1989. By 1985, the International Herald Tribune had a circulation of 160,000 and was profitable, with annual revenues of around $40 million. At the time of the paper's centennial in 1987, the IHT was opening a new print site on average each year. By the early 1990s, the paper was printed concurrently around the globe, with seven sites in Europe, three in Asia, and one in America, allowing day-of-publication availability in all major cities worldwide. Notably, every region received the same editorial content, and even most of the advertising ran across all areas; by comparison, the international edition of The Wall Street Journal was heavily regionalized. (Several editions were published of each day's paper, however, and sometimes particular regions saw revisions that other regions did not.) Nearly 200,000 copies were sold per day, including 50,000 in Asia and 45,000 copies to airlines flying international routes. Despite the technology, however, in practice stories often appeared in the International Herald Tribune a day after they appeared in either of the parent papers. Marking a departure from its origins as a paper mostly read by American expatriates and travelers in Europe, by this point the majority of its readers were non-American.
The International Herald Tribune's main editorial team was based in Paris, and while content for the paper largely consisted of stories, columns, and editorials from the two parent papers, the paper reported from many news sources, including its own corps of correspondents and columnists. In any case, all of the final editing was done by the Paris staff. By 2002, the International Herald Tribune had some 335 employees. Some columnists from the parent papers, such as Flora Lewis and Art Buchwald, kept publishing columns in the International Herald Tribune even after their work no longer appeared in the parent publications. Over the years, the International Herald Tribune faced increasing newsstand competition from the international editions of The Wall Street Journal, USA Today, and the Financial Times. Furthermore, the advent of the internationally available cable news network CNN, and later the Internet, gave Americans more readily available ways to keep up on sports scores and the like. As the 21st century dawned, opinions were divided about the International Herald Tribune's place in the media world, with, for instance, James Ledbetter of Slate pronouncing it a relic of a bygone era and Peter Osnos of The Atlantic believing it still had a role to play. In October 2002, it was announced that The New York Times Company ("The Times") would buy out the Post's interest for around $70 million. The Times thereby became the sole owner of the International Herald Tribune. The change became effective with the edition published on January 2, 2003. The headquarters for the paper remained at its site in Neuilly-sur-Seine. The Times subsequently folded the International Herald Tribune website into its own website during 2009. In 2005 the paper opened its Asia newsroom in Hong Kong.
In April 2001, the Japanese newspaper The Asahi Shimbun (朝日新聞) partnered with the International Herald Tribune to publish an English-language newspaper, the International Herald Tribune/Asahi Shimbun. After The Washington Post sold its stake in the International Herald Tribune, the paper continued being published under the name International Herald Tribune/Asahi Shimbun, but it was discontinued in February 2011. By 2008, the circulation of the paper was over 240,000. By the early 2010s, the Internet edition of the paper was receiving some seven million visitors per month, and overall the IHT represented one of the biggest global media entities.

Writers and journalists

Throughout its history the Paris-based paper had a renowned stable of writers and journalists. Among the best known were the humorist Art Buchwald, the fashion editor Suzy Menkes, jazz critic Mike Zwerin and food writers Waverly Root and Patricia Wells. Former executive editors include Philip Manning Foisie, John Vinocur, David Ignatius and Michael Getler.

The final years

In 2013, The New York Times Company announced that the International Herald Tribune was being renamed The International New York Times. On October 14, 2013, a Monday, the International Herald Tribune appeared on newsstands for the last time and ceased publication under that name. In 2016, the NYT Paris offices, acquired from the IHT, closed amid massive layoffs. The National Book Review called it the "end of a romantic era in international journalism".

Archives

The archives of the International Herald Tribune, all the articles from 1887 until 2013, were sold or licensed to the Gale company, where they began appearing in 2017. This material is not available from any New York Times archive. The New York Times website does, however, host a very limited selection of "retrospective" stories from the 1887–2013 years, a collection that became available in 2017, the same year that the full archives became available on Gale. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/ResearchGate] | [TOKENS: 2740] |
ResearchGate

ResearchGate is a commercial social networking site for scientists and researchers to share papers, ask and answer questions, and find collaborators. According to a 2014 study by Nature and a 2016 article in Times Higher Education, it is the largest academic social network in terms of active users, although other services have more registered users, and a 2015–2016 survey suggests that almost as many academics have Google Scholar profiles. While reading articles does not require registration, people who wish to become site members need to have an email address at a recognized institution or to be manually confirmed as a published researcher in order to sign up for an account. Articles are free for visitors to read; however, additional features (such as job postings or advertisements) are accessible only with a paid subscription. Members of the site each have a user profile and can upload research output including papers, data, chapters, negative results, patents, research proposals, methods, presentations, and software source code. Users may also follow the activities of other users and engage in discussions with them. Users are also able to block interactions with other users. The site has been criticized for sending unsolicited email invitations to coauthors of the articles listed on the site that were written to appear as if the email messages were sent by the other coauthors of the articles (a practice the site said it had discontinued as of November 2016) and for automatically generating apparent profiles for non-users who have sometimes felt misrepresented by them. A study found that over half of the uploaded papers appear to infringe copyright, because the authors uploaded the publisher's version.

Features

The New York Times described the site as a mashup of Facebook, Twitter, and LinkedIn. Site members may follow a research interest, in addition to following other individual members.
It has a blogging feature for users to write short reviews of peer-reviewed articles. ResearchGate indexes self-published information on user profiles to suggest members to connect with others who have similar interests. When a member posts a question, it is fielded to others who have identified relevant expertise on their user profiles. It also has private chat rooms where users can share data, edit shared documents, or discuss confidential topics. The site also features a research-focused job board. As of 2020, it has more than 17 million users, with its largest user bases coming from Europe and North America. Most of ResearchGate's users are involved in medicine or biology, though it also has participants from engineering, law, computer science, agricultural sciences, and psychology, among others. ResearchGate has published an author-level metric, the "RG Score", since 2012. The RG Score is not a citation impact measure. RG Scores have been reported to be correlated with existing author-level metrics, but have also been criticized as having questionable reliability and an unknown calculation methodology. In March 2022 ResearchGate announced it would remove the RG Score after July 2022. ResearchGate does not charge fees for putting content on the site and does not require peer review.

History

ResearchGate was founded in 2008 by virologist Ijad Madisch, who remains the company's CEO, with physician Sören Hofmayer and computer scientist Horst Fickenscher. It started in Boston, Massachusetts, and moved to Berlin, Germany, shortly afterwards. The company's first round of funding, in 2010, was led by the venture capital firm Benchmark. Benchmark partner Matt Cohler became a member of the board and participated in the decision to move to Berlin. The website began with few features, and developed based on input from scientists. From 2009 to 2011, the number of users of the site grew from 25,000 to more than 1 million.
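The expertise-based question routing described above can be sketched as a simple tag-overlap match. The function and profile fields here are hypothetical illustrations of the idea, not ResearchGate's actual implementation or API.

```python
def route_question(question_tags, users):
    """Rank users by how many of a question's topic tags overlap their
    self-declared profile interests (a toy model of expertise routing)."""
    qt = set(question_tags)
    scored = [(len(qt & set(u["interests"])), u["name"]) for u in users]
    # Highest overlap first; drop users with no matching expertise.
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

profiles = [
    {"name": "Alice", "interests": ["virology", "immunology"]},
    {"name": "Bob", "interests": ["machine learning"]},
    {"name": "Carol", "interests": ["virology"]},
]
# A virology question is fielded to Alice and Carol, but not to Bob.
matches = route_question(["virology", "vaccines"], profiles)
```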
A second round of funding led by Peter Thiel's Founders Fund was announced in February 2012. On June 4, 2013, the company closed Series C financing of $35 million from investors including Bill Gates. The company grew from 12 employees in 2011 to 120 in 2014. As of 2016, it had about 300 employees, including a sales staff of 100. ResearchGate's competitors include Academia.edu, Google Scholar, and Mendeley, as well as newer entrants such as Semantic Scholar. In 2016, Academia.edu reportedly had more registered users (about 34 million versus 11 million) and higher web traffic, but ResearchGate was substantially larger in terms of active usage by researchers. The fact that ResearchGate restricts its user accounts to people at recognized institutions and published researchers may explain the disparity in active usage, as a high percentage of the accounts on Academia.edu are lapsed or inactive. In a 2015–2016 survey of academic profile tools, about as many respondents had ResearchGate profiles as Google Scholar profiles, but almost twice as many respondents used Google Scholar for search as used ResearchGate for accessing publications. Madisch has said the company's business strategy is focused on highly targeted advertising based on analysis of the activities of users, saying "Imagine you could click on a microscope mentioned in a paper and buy it", and estimating the spending on science at $1 trillion per year under the control of a "relatively small number of people". In November 2015 the company acquired additional funding of $52.6 million from a range of investors including Goldman Sachs, Benchmark Capital, Wellcome Trust and Bill Gates, but did not announce this until February 2017. Losses increased from €5.4 million in 2014 to €6.2 million in 2015, but ResearchGate's CEO expressed optimism that the company would eventually break even. ResearchGate settled a copyright lawsuit with the publishers Elsevier and the American Chemical Society on 15 September 2023.
As of January 2023, ResearchGate has partnered with Sage to distribute open access content.

Reception

A 2009 article in BusinessWeek reported that ResearchGate was a "potentially powerful link" in promoting innovation in developing countries by connecting scientists from those nations with their peers in industrialized nations. It said the website had become popular largely due to its ease of use. It also said that ResearchGate had been involved in several notable cross-country collaborations between scientists that led to substantive developments. Academic reception of ResearchGate remains generally positive, as recent reviews of extant literature show an accepting audience with broad coverage of concepts. A 2012 paper published in The International Information & Library Review conducted a survey with 160 respondents and reported that out of those respondents using social networking "for academic purposes", Facebook and ResearchGate were the most popular at the University of Delhi, but also that "a majority of respondents said using SNSs [Social Networking Sites] may be a waste of time". Although ResearchGate is used internationally, its uptake—as of 2014—is uneven, with Brazil having particularly many users and China having few when compared to the number of publishing researchers. In a 2014 study by Nature, 88 percent of the responding scientists and engineers said that they were aware of ResearchGate and would use it when "contacted", but less than 10 percent said they would use it to actively discuss research, with 40 percent instead preferring Twitter for discussing research. ResearchGate was visited regularly by half of those surveyed by Nature, coming second to Google Scholar. 29 percent of regular visitors had signed up for a profile on ResearchGate in the past year, and 35 percent of the survey participants were invited by email.
A 2016 article in Times Higher Education reported that, in a global survey of 20,670 people who use academic social networking sites, ResearchGate was the dominant network and was twice as popular as others: 61 percent of respondents who had published at least one paper had a ResearchGate profile. Another study reported that "relatively few academics appear to post questions and answers", with most instead using the site only as an "online CV". In the context of big deal cancellations by several library systems around the world, the wide usage of ResearchGate was credited as one of the factors that reduced the apparent value of subscriptions to toll access resources. Data analysis tools like Unpaywall Journals, used by libraries to calculate the real costs and value of their options before such decisions, make it possible to separate ResearchGate from open archives like institutional repositories, which are considered more stable.

Criticism

ResearchGate's decision not to remove convicted sex offenders from its social networking site has been criticized by Canadian authorities. Many researchers worldwide deleted their accounts in protest after ResearchGate refused to remove Ben Levin, a convicted child pornographer and registered sex offender in Canada, as a user. Identified on ResearchGate as "Research Ben", Levin had been a frequent user of the site, posting over 80 papers, the vast majority dealing with studies of child pornography and pedophiles. ResearchGate has been criticized for emailing unsolicited invitations to the coauthors of its users. These emails were written as if they were personally sent by the user, but were sent automatically unless the user opted out, which caused some researchers to boycott the service and contributed to the negative view of ResearchGate in the scientific community. As of November 2016, the site appears to have discontinued this practice.
The TechCrunch moderator Mike Butcher accused ResearchGate of having scraped competitors' websites for email addresses to spam, which the ResearchGate CEO denied. A study published by the Association for Information Systems in 2014 found that a dormant account on ResearchGate, using default settings, generated 297 invitations to 38 people over a 16-month period, and that the user profile was automatically attributed more than 430 publications. Furthermore, journalists and researchers found that the RG score, calculated by ResearchGate via a proprietary algorithm, can reach high values under questionable circumstances. Several studies have examined the RG score, whose method of calculation is not published. These studies concluded that the RG score was "intransparent and irreproducible", criticized the way it incorporates the journal impact factor into the user score, and suggested that it should "not be considered in the evaluation of academics". The results were confirmed in a second "response" study, which also found the score to depend mostly on journal impact factors. The RG score was found to be negatively correlated with network centrality: users who are most active on ResearchGate (and thus central to the network) usually do not have high RG scores. It was also found to be strongly positively correlated with Quacquarelli Symonds university rankings at the institutional level, but only weakly with Elsevier SciVal rankings of individual authors. While the RG score was correlated with different university rankings, the correlation between those rankings themselves was higher. Nature also reported that "Some of the apparent profiles on the site are not owned by real people, but are created automatically – and incompletely – by scraping details of people's affiliations, publication records and PDFs, if available, from around the web.
That annoys researchers who do not want to be on the site, and who feel that the pages misrepresent them – especially when they discover that ResearchGate will not take down the pages when asked." ResearchGate uses a crawler to find PDF versions of articles on the homepages of authors and publishers. These are then presented as if they had been uploaded to the web site by the author: the PDF is displayed embedded in a frame, and only the button label "External Download" indicates that the file was in fact not uploaded to ResearchGate.[citation needed] ResearchGate has been criticized for failing to provide safeguards against "the dark side of academic writing", including such phenomena as fake publishers, "ghost journals", publishers with "predatory" publication fees, and fake impact ratings. It has also been criticized for copyright infringement of published works. In September 2017, lawyers representing the International Association of Scientific, Technical, and Medical Publishers (STM) sent a letter to ResearchGate threatening legal action for copyright infringement and demanding that it alter its handling of uploaded articles to include pre-release checking for copyright violations and "Specifically, [for ResearchGate to] end its extraction of content from hosted articles and the modification of any hosted content, including any and all metadata. It would also mean an end to Researchgate's own copying and downloading of published journal article content and the creation of internal databases of articles." This was followed by an announcement that takedown requests would be issued to ResearchGate for copyright infringement relating to millions of articles. A statement supporting the action was issued by a group called the Coalition for Responsible Sharing, signed by the American Chemical Society, Brill Publishers, Elsevier, Wiley, and Wolters Kluwer.
Subsequently, the Coalition for Responsible Sharing (CfRS) reported that "ResearchGate has removed from public view a significant number of copyrighted articles it is hosting on its site". CfRS also confirmed that "not all violations have been addressed", and as such, further takedown notices have been issued. ResearchGate has reached an agreement on article uploading with three other major publishers, Springer Nature, Cambridge University Press, and Thieme. Under the agreement, the publishers will be notified when their articles are uploaded but will not be able to premoderate uploads.
========================================
[SOURCE: https://www.wired.com/story/one-womans-mission-to-rewrite-nazi-history-wikipedia/] | [TOKENS: 10213] |
Noam Cohen | The Big Story | Sep 7, 2021 7:00 AM

One Woman’s Mission to Rewrite Nazi History on Wikipedia

Ksenia Coffman’s fellow editors have called her a vandal and a McCarthyist. She just wants them to stop glorifying fascists—and start citing better sources.

[Photo: Ksenia Coffman in her neighborhood in San Jose, California. Photograph: Talia Herman]

When Ksenia Coffman started editing Wikipedia, she was like a tourist in Buenos Aires in the 1950s. She came to learn the tango, admire the architecture, sip maté. She didn’t know there was a Nazi problem. But Coffman, who was born in Soviet-era Russia and lives in Silicon Valley, is an intensely observant traveler. As she link-hopped through articles about the Second World War, one of her favorite subjects, she saw what seemed like a concerted effort to look the other way about Germany’s wartime atrocities. Coffman can’t recall exactly when her concern set in. Maybe it was when she read the article about the SS, the Nazi Party’s paramilitary, which included images that felt to her like glamour shots—action-man officers admiring maps, going on parade, all sorts of “very visually disturbing” stuff. Or maybe it was when she clicked through some of the pages about German tank gunners, flying aces, and medal winners. There were hundreds of them, and the men’s impressive kill counts and youthful derring-do always seemed to exist outside the genocidal Nazi cause. What was going on here? Wikipedia was supposed to be all about consensus. Wasn’t there consensus on, you know, Hitler? A typical person might have thought, Something is wrong on the internet again. What a bummer. Next tab. But Coffman is the person who finishes the thousand-page Holocaust novel. Whatever she chooses to spend her time on—powerlifting, fragrance collecting, denazification—she approaches the assignment like a straight-A student. You can time-travel back and watch her begin.
Wikipedia never forgets; it keeps a permanent public record of every change an editor makes. In early November 2015, you will find K.e.coffman in “20 July plot,” an article about the failed plan by German officers to assassinate Hitler. A sentence has jumped out at her. It says that some of the conspirators came to see the plot as “a grand, if futile gesture” that would save “the honour of themselves, their families, the army and Germany.” The claim isn’t supported by any sources. It’s conjecture, hearsay. And to her it seems strangely flattering. Coffman navigates over to the Wikipedia article about one of the conspirators—Arthur Nebe, a high-ranking member of the SS. Apart from his role in the plot, Nebe’s main claim to notability is that he came up with the idea of turning vans into mobile gas chambers by piping in exhaust fumes. The article acknowledges both of these facts, along with the detail that Nebe tested his system on the mentally ill. But it also says that he worked to “reduce the atrocities committed,” going so far as to give his bloodthirsty superiors inflated death totals. [This article appears in the November 2021 issue of WIRED.] Coffman will recall that she feels “totally disoriented.” She cannot believe that an innovator in mass murder would have tried to protect the Jews and other supposed subhumans his troops rounded up. She checks the footnotes. The claim is attributed to War of Extermination, a compendium of academic essays originally published in 1995. Coffman knows the book is legit, because she happens to have a copy on loan from the library. When she goes to the cited page, she finds a paragraph that appears to confirm all the Wikipedia article’s wild claims. But then she reads the first sentence of the next paragraph: “This is, of course, nonsense.” The level of bad faith is eye-opening for Coffman.
She is “very appalled.” She sees that her confidence in Wikipedia was “very much misplaced.” All it takes to warp historical memory, she realizes, is something this small, achievable for almost anyone with a keyboard. “So few people can have so much impact, it’s a little scary,” she says. She begins to turn a more critical eye to what she sees on Wikipedia. Especially the footnotes. In a long spree of edits, Coffman cleans up the two articles. She goes to the Talk page for “20 July plot,” where editors debate changes to the main article. She copy-pastes the language about the grand, futile gesture. “I would like to remove this part,” she writes. “Thoughts? Objections?” Another editor voices support. With a click, the paragraph is gone. In the Nebe article, Coffman adds a “[citation needed]” tag to the flagrantly false claim. She identifies two more dubious sources—one misleadingly quoted, one potentially invented. She checks out a book called The SS: Alibi of a Nation to make sure. Over and over again, she reworks Nebe’s legacy: At first, it’s that some historians “have a much harsher view” of him than others. Then it’s that they “have a less generous view.” Then it’s “Historians have a negative view of Nebe and his motivations, despite his participation in the 20 July plot.” Coffman is beginning to understand that history is an edit war. Truth, factual and moral, hangs in the balance. Similar battles over how to remember the past have been raging across society. Do we let the old bronze statues stand in our boulevards, or do we put them in a museum someplace, or do we melt them down? Can there be a “hero” who fought for a morally rotten cause? Are qualities like valor and self-sacrifice and tactical brilliance worth admiring anywhere they occur, even if, say, racial supremacism is there too? Some choose to take to the streets. Coffman fights on the terrain most familiar to her, with the weapons she knows best.
Not that she would put it that way; she’s not big on war metaphors. [Photo: Coffman at home. Photograph: Talia Herman] Several weeks into her new obsession, Coffman realizes that she’s supposed to fill out her User page—the Wikipedia equivalent of a profile, where editors broadcast opinions, grudges, achievements, pet peeves. One Saturday night she updates it for the first time. “I’m a new editor to Wikipedia,” she writes. “I enjoy contributing and engaging with other editors.” An hour later, past midnight, she adds: “My editing style tends to be bold.” Coffman was raised by engineers in the waning days of the Soviet Union. She had what she describes as a “culturally privileged upbringing” in Moscow. She went to galleries, museums, the theater. In her neighborhood, she remembers fondly, there was a recycling kiosk that rewarded you with literature. “For this number of kilos of paper you could get these books,” she says. “Classics: Pushkin, Tolstoy. Reading was encouraged.” She wasn’t taught to romanticize the war. “The martial qualities of the veterans were never celebrated,” Coffman says. “It wasn’t about the glorious victories, fighters zooming down on enemy ships.” Her grandfather, a soil scientist, served in the Red Army as a combat engineer and survived the assault on Leningrad. But in typical fashion, she says, she heard next to nothing about his experiences when she was a kid. (For the first time, in response to questions for this article, Coffman asked her father what he knew. She reported back that at one point her grandfather had considered suicide. “The only thing that prevented him from doing this was the thought that he had to get back to his wife and kids,” she wrote.) At university, Coffman majored in computational linguistics, a field that combined her interests in language and science. She was a top student and won a scholarship for business school in the Bay Area. She arrived during the dotcom boom and never left.
“When I moved to the US, I didn’t have this idea of the shining beacon of democracy,” she says. But at least she could feel safe. “I would walk down the street and the police officer wouldn’t assault me or ask me for a bribe.” Coffman, who has broad shoulders under a bob of blond hair, thinks and talks deliberately. She lives in a compact townhouse in a planned community in San Jose, California. The museums and galleries are harder to get to now (“I have to drive to San Francisco, find parking”), but she keeps stimulated with books, hobbies, and books about her hobbies. When I visited her at home earlier this year, she walked me past the weight-lifting setup on the ground floor. (She read Starting Strength: Basic Barbell Training for that one. She approves of the book because it’s “like a science manual.”) Upstairs, I recognized the tall, narrow bookshelf that appears behind her during Zoom calls. It contains dozens of titles that wouldn’t look out of place in a history grad student’s apartment—Hitler’s Generals on Trial. Kiev: 1941. Soldaten: On Fighting, Killing, and Dying. A few others, like In the Company of Women, nod to a career in business. [Photo: Ksenia Coffman's bookshelf in her San Jose home. Photograph: Talia Herman] The Second World War is where Coffman feels most comfortable, but in 2015, she says, she got interested in the US Civil War. That summer, a young white supremacist murdered nine congregants at a Black church in Charleston, South Carolina. The shooting made her realize, she says, that “there was all this other America” that lay beyond her experience—a place deeply scarred by a past she barely understood. So Coffman did what she always did: She read. And because she happened to be between jobs, she was free to immerse herself in history for long stretches. She learned about the Civil War, the conflict behind so much of the turmoil in the United States.
She read about “lost cause” ideology, which claims the Confederacy actually fought to preserve high-minded Southern ideals, not specifically the institution of slavery. She brushed up on her knowledge of the Second World War, a struggle more familiar to her. Maybe the lack of a job, of people to collaborate with, is also what made Wikipedia seem like an attractive pastime. That’s what it was supposed to be: another hobby. At first, Coffman stuck to tentative, sporadic suggestions. But then she was making edits nearly every day; there was so much to fix. She liked the site’s intricate bureaucracy—the guidelines on etiquette and reliable sourcing, the policies on dispute resolution and article deletion, the learned essays and discussion pages that editors cite like case law. “Wikipedia is very regimented,” she says. “I am good with instructions.” “G’day,” Peacemaker67 begins his note for K.e.coffman. It’s late 2015, and he is concerned about recent changes to an article on Wikipedia (“WP” for short) about an SS tank division made up of Nordic Nazi volunteers. “Sorry but there appears to be some sort of misunderstanding about what should be deleted on WP, and I just want to clarify it before this gets too far down the track.” Coffman recognizes this editor’s handle. He’s Australian, and his User page says he served as a peacekeeper in the former Yugoslavia. He is the same person who invited her to join WikiProject Military History, a group where editors can chat, take classes, win plaudits, and work on articles together. Not for the first time, Coffman has been removing material from the article about the tank division. She thinks it’s full of unsourced fancruft, the Wikipedia word for fawning, excessively detailed descriptions that appeal to a tiny niche of readers—in this case, those thrilled by accounts of battle.
The article tells how “the division acquitted itself well” even against “stiffening resistance,” how it “held the line” and earned the “grudging respect” of skeptical commanders. One contributor has used the eyebrow-raising phrase “baptism of fire.” It’s as if the editors don’t see the part lower down the page where a soldier uses the phrase “and then we cleaned a Jew hole.” The glorifying language, Coffman thinks, is a clear sign that this is historical fan fiction. It elides the horrors of war. If editors want such details to stay on the page, at a minimum they should use a better source than Axis History, a blog whose motto is “Information not shared is lost.” The interaction starts out politely enough. “IMHO it is good that you are deleting citations from unreliable bloggy sources,” Peacemaker67 says. “But just because material is sourced to them doesn’t mean it is wrong.” K.e.coffman replies in less than an hour. “Thank you for your note,” she writes. “Yes, I was surprised about how little I was able to salvage as I was editing the article.” She lists 17 bullet-pointed examples of biased language, Nazi glorification, and unreliable claims. “Would Wikipedia not be better without such content?” she asks. “Well, people are on WP for different reasons,” Peacemaker67 replies. “I don’t go around deleting stuff because I think it might be dodgy.” He cites a page that counsels gradualism in editing, because Wikipedia is a work in progress. “Articles have long histories, and there is no WP:DEADLINE,” he says. Coffman cites a different doctrine in response. “I’m of the view that there’s indeed a deadline: Wikipedia:The deadline is now,” she writes. “Why perpetuate misinformation when it can be removed, or give legitimacy to glorification while there are already plenty of sites that do that?
I believe Wikipedia’s standards to be higher.” Peacemaker67’s final response, nine minutes later, is curt: “If you take this sort of action on articles on my watchlist, expect to be reverted and asked to provide reliable sources that contradict what is in the article.” Like other editors whom Coffman will encounter, Peacemaker67 sees something pernicious in her work. In a recent email, he told me that he considers Coffman’s approach “most unencyclopaedic and a prime example of what Wikipedia is not (see WP:NOTCENSORED).” He went on: “Will we apply the same censorship to military history articles on units of the Khmer Rouge? Turkish military units involved in the Armenian Genocide? Rwandan military units involved in the genocide in that country? US cavalry units that massacred Native Americans? Arkan’s Tigers? Where does that end?” Coffman finds her next target in the footnotes of the article about the tank division. This one’s name is Franz Kurowski, and he seems to pop up all over the place. Kurowski served in the Luftwaffe. After the war, he tried his hand at all sorts of popular writing, often with a pseudonym to match: Jason Meeker and Slade Cassidy for his crime fiction and westerns, Johanna Schulz and Gloria Mellina for his chick lit. But his accounts of the Second World War made him famous under his own name. Kurowski’s stories weren’t subtle. As the German historian Roman Töppel writes in a critical essay: “They depict war as a test of fate and partly as adventure. German war crimes are left out—much unlike allied war crimes.” To understand this dubious chronicler better, Coffman goes to Google, where she comes upon a book called The Myth of the Eastern Front. It describes how, in the immediate aftermath of the war, characters like Kurowski worked to rehabilitate the image of the German army—to argue that a few genocidal apples had spoiled the barrel. With a guy like Hitler to pin the blame on, the rest was easy.
The so-called “myth of the clean Wehrmacht” took root on both sides of the Atlantic: German society needed to believe that not everyone who wore a gray uniform was evil, and the Americans were courting every anti-Communist ally they could find. Then, in the mid-1990s, a museum exhibit cataloging the crimes of the Nazi-era military traveled throughout Germany. An odd situation emerged: Germans began to speak more honestly about the Wehrmacht than non-Germans did. When Coffman reads this, something clicks. She is dealing with a poisonous tree here. She shouldn’t be throwing out individual pieces of fruit. She should be chopping it off at the trunk. She starts to pivot from history (the facts themselves) to historiography (the way they’re gathered). She begins to use Wikipedia to document the false historical narrative, and its purveyors, and then make the fight about dubious sources rather than specific articles. On Christmas Eve, she returns to Arthur Nebe’s page and makes a one-word addition: “Historians have a uniformly negative view of Nebe and his motives.” In the spring of 2016, Coffman goes through hundreds of articles about the winners of various Nazi medals, including one called the Knight’s Cross of the Iron Cross. She removes biased sources and any information based on those sources. When she is done, typically, there is nothing left to the article—nothing to say about the person—other than the fact that he won an award. She then insists that an award isn’t reason enough for a stand-alone Wikipedia article. Without a reliable source telling your life story, you can’t be notable. Poof. Another Nazi legend bites the dust. A particularly revered medal winner, or a high-ranking one, might survive Coffman’s purge. But the results aren’t pretty. When she arrives at Kurt Knispel’s page, it says that he was “one of the, if not the, greatest tank ace of all time.” His photo shows a young gunner with shaggy blond hair and a goatee.
He flashes a smile, unaware that he is doomed. Unfortunately for Knispel, his reputation rests almost entirely on stories told by Kurowski, as well as an account in the Wehrmachtbericht, the Nazi propaganda broadcast. Coffman strips away the apocryphal stories of action and adventure, like the one that says Knispel was held back from promotions because he assaulted a superior. When she’s done, the article is reduced to four paragraphs, three of which relate to his death, at age 23, when he was struck by a Soviet tank. Later, someone will leave a short, sad note on the article’s Talk page: “There used to be a lot of information here about his military career, unconventional attitude to military discipline etc. … Why has it been deleted?” Coffman’s edits have jumped from 1,400 a month to 5,000. She is entering her most prolific period. She has been filling her User page with study guides and research, but now her tone gets bolder, punchier. The names of the sections go from dry (“Waffen-SS revisionism”) to cheerfully contemptuous (“High Moral Fiber Sub-department”). The page is becoming a sprawling tongue-in-cheek taxonomy of her obsession—and the parapet from which she taunts her adversaries. “G’day,” she reads in a note in the summer of 2016. It’s Peacemaker67 again, back with one last warning. “I’ve noticed that you have been nominating articles on Knight’s Cross of the Iron Cross recipients for deletion, after you have deleted significant amounts of text and possible sources from them,” he writes. “That type of behaviour is deplorable, and not appropriate on en WP.” (Coffman’s detractors often imply that she doesn’t fit in on “en WP,” or English Wikipedia. They often assume that she is a visitor from German Wikipedia, “de WP,” because of her insistence on holding the Wehrmacht to account.) “I suggest you stop,” Peacemaker67 concludes. “Cheers.” They go back and forth again.
Eventually, Coffman appeals to the broader Wikipedia community to decide who is right about the notability of these medal winners. “The issue appears to be complex, so I would appreciate further input,” she writes. The debate hinges on certain policy wordings, along with the question of how to compare military awards from France, the US, Great Britain, and Germany. The WikiProject Military History members are well represented, but Coffman picks up crucial support. A user called MaxRavenclaw objects to the claim that purging Iron Cross winners is a form of “victor’s justice”: “You should know that history is written by the literate, not the victors. You can’t expect anyone to take you seriously when you make such statements.” The fight rages across pages for months. In the fall, Peacemaker67 writes that he is “frankly sick to death” of K.e.coffman’s “ongoing campaign.” It is “detracting from the enjoyment of the volunteer editors who actually contribute to this encyclopaedia,” he writes. A careful reader of his cri de coeur will note that he assumes Coffman to be a man (“Community norms rule on WP, not his personal views”). This is a common misimpression among the Military History gang. Coffman never tries to correct it. [Photo: Coffman on a hill above her neighborhood. Photograph: Talia Herman] After six months of debate, on January 22, 2017, Coffman is vindicated. An administrator leaves a note steeped in Wikipedia reasoning. “In the case of the Knight’s Cross the community has established a consensus,” it concludes. “Sufficient reliable sources are lacking for many recipients.” In other words, there should be no presumption that winning a Knight’s Cross of the Iron Cross makes you notable enough for a Wikipedia article.
The only thing you’re guaranteed is a one-line spot on a long list of winners. After the case is settled, Coffman and her more vocal opponents retreat to separate corners. But one bitter-ender, LargelyRecyclable, appears to create a troll account and continues objecting to her changes. She finally takes the user to the Arbitration Committee, English Wikipedia’s version of the Supreme Court. The panel doesn’t wade into the specifics, writing explicitly that “it is not the role of the Arbitration Committee to settle good-faith content disputes among editors.” But what it does rule gives Coffman a feeling of support, she says. LargelyRecyclable is banned indefinitely from editing English Wikipedia. The ArbCom also notes that groups like WikiProject Military History “do not have any authority over article content or editor conduct, or any other special powers.” They can accuse Coffman of whatever they like—vandalism, McCarthyism, “deletionist zeal.” She has just as much right to edit history as they do. And few can match her output: 97,000 edits, 3,200 pages created, countless debates argued and won. Today, K.e.coffman is a solid member of English Wikipedia’s editorial elite—No. 734 out of 121,000, as of this writing. She keeps a watch list with about 2,000 articles on it. A notification pops up next to the listing whenever someone tries to make a change. That’s the thing about edit wars: They never end. But Coffman, of course, avoids martial language. Wikipedia isn’t a battlefield; it’s real estate. “You have to maintain your house,” she says. “You have to have a security system.” On her User page now, there are sections called “Nazi fancruft” and “Apocryphal nicknames.” There are lists of apologist sources and right-wing publishers.
There is an entire offshoot page called “My allegedly problematic behaviour,” where she keeps track of the accusations against her—“campaigning,” “forum-shopping,” “not dropping the stick.” She has even given herself an award for all her heroic work: the Vandal’s Cross of Iron Cross with Swords and Diamonds.Let us know what you think about this article. Submit a letter to the editor at mail@wired.com. One Woman’s Mission to Rewrite Nazi History on Wikipedia When Ksenia Coffman started editing Wikipedia, she was like a tourist in Buenos Aires in the 1950s. She came to learn the tango, admire the architecture, sip maté. She didn’t know there was a Nazi problem. But Coffman, who was born in Soviet-era Russia and lives in Silicon Valley, is an intensely observant traveler. As she link-hopped through articles about the Second World War, one of her favorite subjects, she saw what seemed like a concerted effort to look the other way about Germany’s wartime atrocities. Coffman can’t recall exactly when her concern set in. Maybe it was when she read the article about the SS, the Nazi Party’s paramilitary, which included images that felt to her like glamour shots—action-man officers admiring maps, going on parade, all sorts of “very visually disturbing” stuff. Or maybe it was when she clicked through some of the pages about German tank gunners, flying aces, and medal winners. There were hundreds of them, and the men’s impressive kill counts and youthful derring-do always seemed to exist outside the genocidal Nazi cause. What was going on here? Wikipedia was supposed to be all about consensus. Wasn’t there consensus on, you know, Hitler? A typical person might have thought, Something is wrong on the internet again. What a bummer. Next tab. But Coffman is the person who finishes the thousand-page Holocaust novel. Whatever she chooses to spend her time on—powerlifting, fragrance collecting, denazification—she approaches the assignment like a straight-A student. 
You can time-travel back and watch her begin. Wikipedia never forgets; it keeps a permanent public record of every change an editor makes. In early November 2015, you will find K.e.coffman in “20 July plot,” an article about the failed plan by German officers to assassinate Hitler. A sentence has jumped out at her. It says that some of the conspirators came to see the plot as “a grand, if futile gesture” that would save “the honour of themselves, their families, the army and Germany.” The claim isn’t supported by any sources. It’s conjecture, hearsay. And to her it seems strangely flattering. Coffman navigates over to the Wikipedia article about one of the conspirators—Arthur Nebe, a high-ranking member of the SS. Apart from his role in the plot, Nebe’s main claim to notability is that he came up with the idea of turning vans into mobile gas chambers by piping in exhaust fumes. The article acknowledges both of these facts, along with the detail that Nebe tested his system on the mentally ill. But it also says that he worked to “reduce the atrocities committed,” going so far as to give his bloodthirsty superiors inflated death totals. This article appears in the November 2021 issue. Subscribe to WIRED. Coffman will recall that she feels “totally disoriented.” She cannot believe that an innovator in mass murder would have tried to protect the Jews and other supposed subhumans his troops rounded up. She checks the footnotes. The claim is attributed to War of Extermination, a compendium of academic essays originally published in 1995. Coffman knows the book is legit, because she happens to have a copy on loan from the library. When she goes to the cited page, she finds a paragraph that appears to confirm all the Wikipedia article’s wild claims. But then she reads the first sentence of the next paragraph: “This is, of course, nonsense.” The level of bad faith is eye-opening for Coffman. 
She is “very appalled.” She sees that her confidence in Wikipedia was “very much misplaced.” All it takes to warp historical memory, she realizes, is something this small, achievable for almost anyone with a keyboard. “So few people can have so much impact, it’s a little scary,” she says. She begins to turn a more critical eye to what she sees on Wikipedia. Especially the footnotes. In a long spree of edits, Coffman cleans up the two articles. She goes to the Talk page for “20 July plot,” where editors debate changes to the main article. She copy-pastes the language about the grand, futile gesture. “I would like to remove this part,” she writes. “Thoughts? Objections?” Another editor voices support. With a click, the paragraph is gone. In the Nebe article, Coffman adds a “[citation needed]” tag to the flagrantly false claim. She identifies two more dubious sources—one misleadingly quoted, one potentially invented. She checks out a book called The SS: Alibi of a Nation to make sure. Over and over again, she reworks Nebe’s legacy: At first, it’s that some historians “have a much harsher view” of him than others. Then it’s that they “have a less generous view.” Then it’s “Historians have a negative view of Nebe and his motivations, despite his participation in the 20 July plot.” Coffman is beginning to understand that history is an edit war. Truth, factual and moral, hangs in the balance. Similar battles over how to remember the past have been raging across society. Do we let the old bronze statues stand in our boulevards, or do we put them in a museum someplace, or do we melt them down? Can there be a “hero” who fought for a morally rotten cause? Are qualities like valor and self-sacrifice and tactical brilliance worth admiring anywhere they occur, even if, say, racial supremacism is there too? Some choose to take to the streets. Coffman fights on the terrain most familiar to her, with the weapons she knows best. 
Not that she would put it that way; she’s not big on war metaphors. Coffman at home. Several weeks into her new obsession, Coffman realizes that she’s supposed to fill out her User page—the Wikipedia equivalent of a profile, where editors broadcast opinions, grudges, achievements, pet peeves. One Saturday night she updates it for the first time. “I’m a new editor to Wikipedia,” she writes. “I enjoy contributing and engaging with other editors.” An hour later, past midnight, she adds: “My editing style tends to be bold.” Coffman was raised by engineers in the waning days of the Soviet Union. She had what she describes as a “culturally privileged upbringing” in Moscow. She went to galleries, museums, the theater. In her neighborhood, she remembers fondly, there was a recycling kiosk that rewarded you with literature. “For this number of kilos of paper you could get these books,” she says. “Classics: Pushkin, Tolstoy. Reading was encouraged.” She wasn’t taught to romanticize the war. “The martial qualities of the veterans were never celebrated,” Coffman says. “It wasn’t about the glorious victories, fighters zooming down on enemy ships.” Her grandfather, a soil scientist, served in the Red Army as a combat engineer and survived the assault on Leningrad. But in typical fashion, she says, she heard next to nothing about his experiences when she was a kid. (For the first time, in response to questions for this article, Coffman asked her father what he knew. She reported back that at one point her grandfather had considered suicide. “The only thing that prevented him from doing this was the thought that he had to get back to his wife and kids,” she wrote.) At university, Coffman majored in computational linguistics, a field that combined her interests in language and science. She was a top student and won a scholarship for business school in the Bay Area. She arrived during the dotcom boom and never left. 
“When I moved to the US, I didn’t have this idea of the shining beacon of democracy,” she says. But at least she could feel safe. “I would walk down the street and the police officer wouldn’t assault me or ask me for a bribe.” Coffman, who has broad shoulders under a bob of blond hair, thinks and talks deliberately. She lives in a compact townhouse in a planned community in San Jose, California. The museums and galleries are harder to get to now (“I have to drive to San Francisco, find parking”), but she keeps stimulated with books, hobbies, and books about her hobbies. When I visited her at home earlier this year, she walked me past the weight-lifting setup on the ground floor. (She read Starting Strength: Basic Barbell Training for that one. She approves of the book because it’s “like a science manual.”) Upstairs, I recognized the tall, narrow bookshelf that appears behind her during Zoom calls. It contains dozens of titles that wouldn’t look out of place in a history grad student’s apartment—Hitler’s Generals on Trial. Kiev: 1941. Soldaten: On Fighting, Killing, and Dying. A few others, like In the Company of Women, nod to a career in business. The Second World War is where Coffman feels most comfortable, but in 2015, she says, she got interested in the US Civil War. That summer, a young white supremacist murdered nine congregants at a Black church in Charleston, South Carolina. The shooting made her realize, she says, that “there was all this other America” that lay beyond her experience—a place deeply scarred by a past she barely understood. So Coffman did what she always did: She read. And because she happened to be between jobs, she was free to immerse herself in history for long stretches. She learned about the Civil War, the conflict behind so much of the turmoil in the United States. She read about “lost cause” ideology, which claims the Confederacy actually fought to preserve high-minded Southern ideals, not specifically the institution of slavery. 
She brushed up on her knowledge of the Second World War, a struggle more familiar to her. Maybe the lack of a job, of people to collaborate with, is also what made Wikipedia seem like an attractive pastime. That’s what it was supposed to be: another hobby. At first, Coffman stuck to tentative, sporadic suggestions. But then she was making edits nearly every day; there was so much to fix. She liked the site’s intricate bureaucracy—the guidelines on etiquette and reliable sourcing, the policies on dispute resolution and article deletion, the learned essays and discussion pages that editors cite like case law. “Wikipedia is very regimented,” she says. “I am good with instructions.” “G’day,” Peacemaker67 begins his note for K.e.coffman. It’s late 2015, and he is concerned about recent changes to an article on Wikipedia (“WP” for short) about an SS tank division made up of Nordic Nazi volunteers. “Sorry but there appears to be some sort of misunderstanding about what should be deleted on WP, and I just want to clarify it before this gets too far down the track.” Coffman recognizes this editor’s handle. He’s Australian, and his User page says he served as a peacekeeper in the former Yugoslavia. He is the same person who invited her to join WikiProject Military History, a group where editors can chat, take classes, win plaudits, and work on articles together. Not for the first time, Coffman has been removing material from the article about the tank division. She thinks it’s full of unsourced fancruft, the Wikipedia word for fawning, excessively detailed descriptions that appeal to a tiny niche of readers—in this case, those thrilled by accounts of battle. The article tells how “the division acquitted itself well” even against “stiffening resistance,” how it “held the line” and earned the “grudging respect” of skeptical commanders. 
One contributor has used the eyebrow-raising phrase “baptism of fire.” It’s as if the editors don’t see the part lower down the page where a soldier uses the phrase “and then we cleaned a Jew hole.” The glorifying language, Coffman thinks, is a clear sign that this is historical fan fiction. It elides the horrors of war. If editors want such details to stay on the page, at a minimum they should use a better source than Axis History, a blog whose motto is “Information not shared is lost.” The interaction starts out politely enough. “IMHO it is good that you are deleting citations from unreliable bloggy sources,” Peacemaker67 says. “But just because material is sourced to them doesn’t mean it is wrong.” K.e.coffman replies in less than an hour. “Thank you for your note,” she writes. “Yes, I was surprised about how little I was able to salvage as I was editing the article.” She lists 17 bullet-pointed examples of biased language, Nazi glorification, and unreliable claims. “Would Wikipedia not be better without such content?” she asks. “Well, people are on WP for different reasons,” Peacemaker67 replies. “I don’t go around deleting stuff because I think it might be dodgy.” He cites a page that counsels gradualism in editing, because Wikipedia is a work in progress. “Articles have long histories, and there is no WP:DEADLINE,” he says. Coffman cites a different doctrine in response. “I’m of the view that there’s indeed a deadline: Wikipedia:The deadline is now,” she writes. “Why perpetuate misinformation when it can be removed, or give legitimacy to glorification while there are already plenty of sites that do that? 
I believe Wikipedia’s standards to be higher.” Peacemaker67’s final response, nine minutes later, is curt: “If you take this sort of action on articles on my watchlist, expect to be reverted and asked to provide reliable sources that contradict what is in the article.” Like other editors whom Coffman will encounter, Peacemaker67 sees something pernicious in her work. In a recent email, he told me that he considers Coffman’s approach “most unencyclopaedic and a prime example of what Wikipedia is not (see WP:NOTCENSORED).” He went on: “Will we apply the same censorship to military history articles on units of the Khmer Rouge? Turkish military units involved in the Armenian Genocide? Rwandan military units involved in the genocide in that country? US cavalry units that massacred Native Americans? Arkan’s Tigers? Where does that end?” Coffman finds her next target in the footnotes of the article about the tank division. This one’s name is Franz Kurowski, and he seems to pop up all over the place. Kurowski served in the Luftwaffe. After the war, he tried his hand at all sorts of popular writing, often with a pseudonym to match: Jason Meeker and Slade Cassidy for his crime fiction and westerns, Johanna Schulz and Gloria Mellina for his chick lit. But his accounts of the Second World War made him famous under his own name. Kurowski’s stories weren’t subtle. As the German historian Roman Töppel writes in a critical essay: “They depict war as a test of fate and partly as adventure. German war crimes are left out—much unlike allied war crimes.” To understand this dubious chronicler better, Coffman goes to Google, where she comes upon a book called The Myth of the Eastern Front. It describes how, in the immediate aftermath of the war, characters like Kurowski worked to rehabilitate the image of the German army—to argue that a few genocidal apples had spoiled the barrel. With a guy like Hitler to pin the blame on, the rest was easy. 
The so-called “myth of the clean Wehrmacht” took root on both sides of the Atlantic: German society needed to believe that not everyone who wore a gray uniform was evil, and the Americans were courting every anti-Communist ally they could find. Then, in the mid-1990s, a museum exhibit cataloging the crimes of the Nazi-era military traveled throughout Germany. An odd situation emerged: Germans began to speak more honestly about the Wehrmacht than non-Germans did. When Coffman reads this, something clicks. She is dealing with a poisonous tree here. She shouldn’t be throwing out individual pieces of fruit. She should be chopping it off at the trunk. She starts to pivot from history (the facts themselves) to historiography (the way they’re gathered). She begins to use Wikipedia to document the false historical narrative, and its purveyors, and then make the fight about dubious sources rather than specific articles. On Christmas Eve, she returns to Arthur Nebe’s page and makes a one-word addition: “Historians have a uniformly negative view of Nebe and his motives.” In the spring of 2016, Coffman goes through hundreds of articles about the winners of various Nazi medals, including one called the Knight’s Cross of the Iron Cross. She removes biased sources and any information based on those sources. When she is done, typically, there is nothing left to the article—nothing to say about the person—other than the fact that he won an award. She then insists that an award isn’t reason enough for a stand-alone Wikipedia article. Without a reliable source telling your life story, you can’t be notable. Poof. Another Nazi legend bites the dust. A particularly revered medal winner, or a high-ranking one, might survive Coffman’s purge. But the results aren’t pretty. When she arrives at Kurt Knispel’s page, it says that he was “one of the, if not the, greatest tank ace of all time.” His photo shows a young gunner with shaggy blond hair and a goatee. 
He flashes a smile, unaware that he is doomed. Unfortunately for Knispel, his reputation rests almost entirely on stories told by Kurowski, as well as an account in the Wehrmachtbericht, the Nazi propaganda broadcast. Coffman strips away the apocryphal stories of action and adventure, like the one that says Knispel was held back from promotions because he assaulted a superior. When she’s done, the article is reduced to four paragraphs, three of which relate to his death, at age 23, when he was struck by a Soviet tank. Later, someone will leave a short, sad note on the article’s Talk page: “There used to be a lot of information here about his military career, unconventional attitude to military discipline etc. … Why has it been deleted?” Coffman’s edits have jumped from 1,400 a month to 5,000. She is entering her most prolific period. She has been filling her User page with study guides and research, but now her tone gets bolder, punchier. The names of the sections go from dry (“Waffen-SS revisionism”) to cheerfully contemptuous (“High Moral Fiber Sub-department”). The page is becoming a sprawling tongue-in-cheek taxonomy of her obsession—and the parapet from which she taunts her adversaries. “G’day,” she reads in a note in the summer of 2016. It’s Peacemaker67 again, back with one last warning. “I’ve noticed that you have been nominating articles on Knight’s Cross of the Iron Cross recipients for deletion, after you have deleted significant amounts of text and possible sources from them,” he writes. “That type of behaviour is deplorable, and not appropriate on en WP.” (Coffman’s detractors often imply that she doesn’t fit in on “en WP,” or English Wikipedia. They often assume that she is a visitor from German Wikipedia, “de WP,” because of her insistence on holding the Wehrmacht to account.) “I suggest you stop,” Peacemaker67 concludes. “Cheers.” They go back and forth again. 
Eventually, Coffman appeals to the broader Wikipedia community to decide who is right about the notability of these medal winners. “The issue appears to be complex, so I would appreciate further input,” she writes. The debate hinges on certain policy wordings, along with the question of how to compare military awards from France, the US, Great Britain, and Germany. The WikiProject Military History members are well represented, but Coffman picks up crucial support. A user called MaxRavenclaw objects to the claim that purging Iron Cross winners is a form of “victor’s justice”: “You should know that history is written by the literate, not the victors. You can’t expect anyone to take you seriously when you make such statements.” The fight rages across pages for months. In the fall, Peacemaker67 writes that he is “frankly sick to death” of K.e.coffman’s “ongoing campaign.” It is “detracting from the enjoyment of the volunteer editors who actually contribute to this encyclopaedia,” he writes. A careful reader of his cri de coeur will note that he assumes Coffman to be a man (“Community norms rule on WP, not his personal views”). This is a common misimpression among the Military History gang. Coffman never tries to correct it. Coffman on a hill above her neighborhood. After six months of debate, on January 22, 2017, Coffman is vindicated. An administrator leaves a note steeped in Wikipedia reasoning. “In the case of the Knight’s Cross the community has established a consensus,” it concludes. “Sufficient reliable sources are lacking for many recipients.” In other words, there should be no presumption that winning a Knight’s Cross of the Iron Cross makes you notable enough for a Wikipedia article. The only thing you’re guaranteed is a one-line spot on a long list of winners. After the case is settled, Coffman and her more vocal opponents retreat to separate corners. 
But one bitter-ender, LargelyRecyclable, appears to create a troll account and continues objecting to her changes. She finally takes the user to the Arbitration Committee, English Wikipedia’s version of the Supreme Court. The panel doesn’t wade into the specifics, writing explicitly that “it is not the role of the Arbitration Committee to settle good-faith content disputes among editors.” But what it does rule gives Coffman a feeling of support, she says. LargelyRecyclable is banned indefinitely from editing English Wikipedia. The ArbCom also notes that groups like WikiProject Military History “do not have any authority over article content or editor conduct, or any other special powers.” They can accuse Coffman of whatever they like—vandalism, McCarthyism, “deletionist zeal.” She has just as much right to edit history as they do. And few can match her output: 97,000 edits, 3,200 pages created, countless debates argued and won. Today, K.e.coffman is a solid member of English Wikipedia’s editorial elite—No. 734 out of 121,000, as of this writing. She keeps a watch list with about 2,000 articles on it. A notification pops up next to the listing whenever someone tries to make a change. That’s the thing about edit wars: They never end. But Coffman, of course, avoids martial language. Wikipedia isn’t a battlefield; it’s real estate. “You have to maintain your house,” she says. “You have to have a security system.” On her User page now, there are sections called “Nazi fancruft” and “Apocryphal nicknames.” There are lists of apologist sources and right-wing publishers. There is an entire offshoot page called “My allegedly problematic behaviour,” where she keeps track of the accusations against her—“campaigning,” “forum-shopping,” “not dropping the stick.” She has even given herself an award for all her heroic work: the Vandal’s Cross of Iron Cross with Swords and Diamonds. Let us know what you think about this article. Submit a letter to the editor at mail@wired.com. 
======================================== |
[SOURCE: https://www.theverge.com/tesla/720157/tesla-death-lawsuit-verdict-lawyer-brett-schreiber-interview] | [TOKENS: 8716] |
The lawyer who beat Tesla is ready for ‘round two’

‘There are two Teslas,’ attorney Brett Schreiber told us. ‘There’s Tesla in the showroom and then there’s Tesla in the courtroom.’

by Andrew J. Hawkins, Transportation editor | Aug 7, 2025, 11:00 AM UTC | Image: The Verge, Getty Images

Andrew J. Hawkins is transportation editor with 10+ years of experience who covers EVs, public transportation, and aviation. 
His work has appeared in The New York Daily News and City & State.

The day after he won an unprecedented $243 million verdict in a wrongful death case against Tesla, attorney Brett Schreiber posted a reel on Instagram celebrating the victory. His song pick: 1992’s “Damn It Feels Good To Be a Gangsta” by the Geto Boys.

“This is a verdict that will change the world,” Schreiber wrote in the caption, as Bushwick Bill, Willie D, and Scarface rap in the background about how “everything’s cool in the mind of a gangsta.”

If that sounds like hyperbole, mixed with a dose of macho boasting, you’re not wrong. But in some sense, Schreiber earned his right to strut.

The day before posting the reel, he stood in a Florida courtroom alongside his clients as a jury handed Tesla a major defeat. The company was partially responsible — 33 percent, to be exact — for a 2019 crash that killed 22-year-old Naibel Benavides and seriously injured her boyfriend, Dillon Angulo. (It was determined that the driver of the Tesla Model S involved, George McGee, was mostly responsible. McGee settled with the families in 2021.) The company was ordered to pay as much as $243 million in punitive and compensatory damages to Angulo and Benavides’ family. It’s a huge sum, though one that could be reduced on appeal.

The verdict was highly unusual, insofar as the case even went to trial. Tesla has studiously avoided facing juries in fatality cases involving its driver assist technology, preferring to settle with the injured parties. But when it has gone to trial, it typically wins, such as in two previous cases in California. The winning streak came to an end with Schreiber’s case. 
We spoke a few days after the decision about his historic win, how he used Elon Musk’s own words to bolster his case, and how Tesla could face even steeper fines in the many lawsuits that are still pending. This transcript has been edited for clarity.

I’ve been following Tesla for a very long time and I know that the company has a pretty solid track record of avoiding these types of judgments. Why do you think this was different?

Well, I mean, they did make an overture to settle the case, and for a very large sum of money. Now, it was a fraction of the verdict, but the condition of the settlement was that it would be secret. And my clients were not interested in a secret settlement. They knew that this was a case and a cause that was bigger than themselves. And it was important to them that we shine a light on what Tesla has done.

My theme in my closing argument was about Tesla’s choices and Tesla’s words. And to your point as to why they’ve been successful, I think it’s in part because there are two Teslas. There’s Tesla in the showroom and then there’s Tesla in the courtroom. And Tesla in the showroom tells you that they’ve invented the greatest full self-driving car the world has ever seen. Mr. Musk has been peddling to consumers and investors for more than a decade that the cars are fully self-driving, that the hardware is capable of full autonomy. And those statements were as untrue the day he said them as they remain untrue today. But then they showed up in a courtroom and they say, No, no, no, this is nothing but a driver assistance feature.

Words matter. Choices matter. … Sometime later today or tomorrow, whenever the clerk finally approves it, all of the admitted trial exhibits are going to be publicly filed on the federal docket in Miami. Documents that only I and the lawyers involved in this case have seen, which shows Tesla knew about people’s constant misuse of their system. 
They knew how they were misusing it. They knew why they were misusing it. And they knew when they were misusing it — going back a decade. And I would encourage you and anybody else who’s been following these to log on to Pacer and to pull those exhibits down because they have never been seen before. And I have been saying for well over a year that the only way that this information was ever going to become public was inside of a courtroom.

And we did that, and the only reason we were able to do that was because the Benavides and Angulo families were courageous enough to stand up to the largest corporation in the world and say, No, you are not going to settle this in secret. This is going to be shared publicly. And the jury saw it with a unanimous verdict. They sent a message to Tesla. Your choices and your words matter. Do better.

I wanted to ask you about how you presented this case to the jury. And specifically the role that Elon Musk and his words played. You made a very pointed show of bringing in comments from the Tesla CEO. Can you talk a little bit about the role that you think that his past comments about Tesla’s Autopilot and about its self-driving capabilities played in this particular trial?

[Musk’s words] were central to it. … The jury is asked about the expectation of an ordinary consumer. What would an ordinary consumer expect this vehicle to do? It didn’t actually matter what the driver himself thought. Now, his feelings about this were very consistent with Musk’s statements, but [Musk] says these things for a reason. He says this to create this idea in the public’s mind that these cars are more than they really are. He makes these comments going back to 2015. Autonomous driving is a solved problem. They are safer than humans. It will stop for anything. It knows if there’s something metal and something dense in front of it, it should stop. It doesn’t matter if it’s an alien spaceship, he said. 
And we played all of those because that aligns with the law.

I said, “Look, you may know that Chick-fil-A has a cow running around telling you to eat more chicken and that LiMu the emu and Doug wants you to buy Liberty Mutual insurance. And that Geico has a gecko that peddles its products.” I said, “No one knows who Andrew Cathy, Tim Sweeney, and Todd Combs are. They are the CEOs of those companies. And even though no one’s heard of them … the decisions they make and the words they speak define what an ordinary consumer thinks of their company and the products they sell.” And I said the same is true of Elon Musk and Tesla. They cannot escape the fact that they have represented for a decade that they have invented and made a vehicle to the public that is the greatest, most advanced, Enhanced Autopilot driving vehicle the world has ever seen. And then they show up in court and they go, Well, there’s no vehicle in 2019 that would have ever stopped under this scenario. It’s a T intersection. It’s a broadside hit. Blah, blah, blah. Well, that opened the door to me to say, “You cannot make these statements publicly and then use as a defense in trial the fact that the car that you’ve claimed for a decade you invented doesn’t actually exist.” The jury saw through it.

This is a type of technology that is used throughout the auto industry. Other car companies have a variety of ADAS technologies that are out there. What’s different about Tesla’s approach? And how did that contribute to this crash, in your opinion?

[GM’s] Super Cruise, [Ford’s] BlueCruise, right, those were similar vintage-era Level 2 systems. They had driver monitoring systems that actually worked. They use infrared cameras. They had systems that were geofenced. You could only use them on certain roadways that they were designed for. 
A lot of other systems at the time, I think Infiniti, Nissan, Honda, and somebody else I can’t remember right now had a system where if you override the adaptive cruise control, the lane centering shuts off. Because you’re either going to use it or you’re not. Tesla didn’t do any of those things.

That was their choice. And that goes back to the whole thing that makes them the outlier. This was not a car company that got into tech. This was a tech company that got into cars. And their production process was unlike what any other responsible automotive manufacturer has ever done. Rather than ensuring that things were ready for prime time, rather than releasing a finished product, they released a beta product. But they tell you, We call it beta, but we don’t really mean it’s beta. Again, they use words to the point where they become meaningless.

I’m sure you’ve been online and you’ve seen some reactions to this verdict. I’ve seen a few from Tesla’s fan base, which is quite substantive. And they talk about how the technology in [George] McGee’s vehicle, Autopilot, is an outdated system — it hasn’t been updated since 2019 — whereas most of the current system, Full Self-Driving or FSD, is supposed to be measurably better. Do they have a point? Or is it beside the point for the outcome of this jury?

For the outcome of this jury, it is beside the point. We could not introduce evidence about 2023 and 2024 and later developments. But I got news for the fan base. It’s not better. They’ve actually eliminated radar. They’ve got cameras only. It doesn’t work. Everyone who knows anything and who’s been following and paying attention in autonomous vehicle development for the last decade knows that the holy trinity of safety is lidar, radar, and cameras. You cannot create a camera-based-only system that is going to be better than a human driver. It’s not possible. It’s not done. 
They sure as heck haven’t done it. And their fusion system … continues to fail. You will see internal documents produced by Tesla where they determine that in 6 percent of the crashes that they received information on in 2019, they themselves determined that Autopilot was at fault.

It’s so stupid, but my point is, it’s not better. It’s a three-legged stool. If you take one of the legs out, the other two fall down. Like I said, I’ll tip my hat to Waymo. I’ll tip my hat to those guys. They geofence. They three-dimensionally map, they tie in infrastructure, they use lidar, they use radar, they use cameras. Are they perfect? No. And that’s the other thing I want to be really clear about. We are not anti-autonomous vehicle technology. We are not anti-progress. To the contrary, we think this stuff can and will save lives. It just has to be done the right way. And Tesla’s done it the wrong way. And this unanimous jury who sat for three weeks listening to 40-plus hours a week of testimony and evidence felt the same way.

Can you talk about what the trial revealed about how Tesla handles its Autopilot data, and also how it interacts with law enforcement when incidents arise and they need access to that data?

The docket should be fully unsealed in about three weeks. The court has ordered that and has given Tesla an opportunity to file a brief about anything specifically they want to keep under wraps. I am confident that the motion that we brought for sanctions against them for withholding evidence for four years will become fully unsealed. It would be irresponsible for me to say more than what they’ve said, but suffice it to say, there is more to that story, and it will be set out.

But to that end, I can say that Tesla has a system of gathering data. They receive it immediately after crashes. And it is a very fair, I would say almost generous, statement to say that they’re not always forthright with that information. And it’s in part because people just don’t understand it. 
Law enforcement doesn’t understand it. Government investigators don’t understand it. Through this case, we actually understood it better than even Tesla’s lawyers did. Now, the in-house people knew. And again, I can’t say whose decision it was to delete the data. But somebody at Tesla knew that if this information on the heels, six weeks later after the Jeremy Banner crash occurred in Deerfield Beach, Florida, that having another Autopilot fatality, that they knew that law enforcement wanted to share with federal investigators, they knew that would be bad for business. Why they did it? Only they can answer that question.“I am confident that the motion that we brought for sanctions against them for withholding evidence for four years will become fully unsealed.”Tesla right now is trying to roll out a robotaxi service in a number of cities. What would you say to people who are interested in this, curious about trying out these vehicles, to regulators who are weighing whether to approve Tesla’s requests?I would say that this verdict hopefully sends a very clear message to Tesla. That they need to do better. They need to elevate people’s lives and people’s safety over greed and profits. That’s what I told the jury in closing argument, that this was not only just an opportunity. I know that jury instruction talks about punishing Tesla and deterring bad conduct. But I told them really this was an opportunity for them to help Tesla, because when a company gets to a point where they’re elevating profits and greed over people’s lives and safety, then that is a company that has lost its way. That is a company that needs to have its course corrected. What I hope, through their efforts at developing a Level 4 system, is that Tesla will receive this message for what it was. It’s an opportunity and a teachable moment to be better.The problem is … it’s my understanding that it is a camera-only-based system. That’s a problem. 
Right, the megapixels on the cameras, on a 2025 Tesla, if they’re anything like what they’re putting on the robotaxi, have a lower megapixel resolution than my iPhone. The human eye is 250 megapixels. Be better. There’s a reason why responsible manufacturers are doing this differently. And again, is it hubris? Is it greed? I don’t know. I don’t know what this motivation is to double down and just try to do it the way that, Oh, we can do it this way and no one else can. I struggle with that. I’m just a lawyer. What do I know? But engineers, people who have spent decades, careers, lifetimes studying this stuff, have reached the same conclusions. So my hope is that they pause. They look at what they’re doing and they find ways to do it better. To do it safer.That’s what this verdict was about: sending a message that you cannot use our public roadways as your personal laboratory to test production vehicles. And then when you discover that an incident occurs, that you make an incremental change. That’s what their corporate representative said. And as I said a couple of times publicly and told the jury, an incident to the families impacted is known as a funeral. These are people’s lives that you’re playing with. So my hope is that they really think about their approach. I hope that consumers demand that they rethink their approach. I hope that analysts looking at the impact of this verdict and potential verdicts in the future tell them that they need to do better. Because I think that’s the only way that we’re going to ever see it really change. It seems likely that there will need to be more of these types of verdicts before we do see some change, either from the company or from the way that the market views the company.“Is it hubris? Is it greed? I don’t know.”Now that we have this verdict, I’m curious to know what you think it’s going to mean for future cases pending against Tesla?Round two, Maldonado v. Tesla, Alameda State Superior Court, 75 days from today. 
Tesla’s going to find out. I’m the plaintiff’s lawyer in that case. And I am not limited in California to a 3x multiplier on punitives. If I had asked that jury in Florida for a billion dollars, they would have given it to me. But I couldn’t ask them for that. Florida law says punitives can only be three times compensatories. I asked for $104 million in compensatories. They gave me $129 [million].I got to tell you something, as a trial lawyer, to get $25 million over your ask is unheard of. They would have given me anything I asked for, not because it was me, but because of the facts. The facts are a stubborn thing. And we get to tell those same facts with a better Autopilot defect theory. And I get to not only juxtapose Musk’s lies in that case, but I juxtapose them with the testimony that I didn’t have in Miami. I’ve only had this case for a year. I worked the Maldonado case from the beginning. And in that case, I have testimony from all of the senior Autopilot leadership: Sterling Anderson, CJ Moore, Andrej Karpathy. And I show them those same quotes that were played to that jury in Miami. I said, “When Mr. Musk said those things, was that a true statement about production vehicles at Tesla?” To a person, they answer: Absolutely not.He not only betrayed the public, he betrayed his own engineers. Betrayal is the most powerful human emotion there is, especially when it comes to rendering a verdict and holding a company account. That’s what round two is going to look like in 75 days. That should be very interesting.Follow topics and authors from this story to see more like this in your personalized homepage feed and to receive email updates.Andrew J. HawkinsCloseAndrew J. HawkinsTransportation editorPosts from this author will be added to your daily email digest and your homepage feed.FollowFollowSee All by Andrew J. 
HawkinsAutonomous CarsCloseAutonomous CarsPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All Autonomous CarsElectric CarsCloseElectric CarsPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All Electric CarsElon MuskCloseElon MuskPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All Elon MuskLawCloseLawPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All LawPolicyClosePolicyPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All PolicyTechCloseTechPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All TechTeslaCloseTeslaPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All TeslaTransportationCloseTransportationPosts from this topic will be added to your daily email digest and your homepage feed.FollowFollowSee All TransportationMost PopularMost PopularXbox chief Phil Spencer is leaving MicrosoftRead Microsoft gaming CEO Asha Sharma’s first memo on the future of XboxThe RAM shortage is coming for everything you care aboutAmazon blames human employees for an AI coding agent’s mistakeWill Stancil, man of the people or just an annoying guy?The Verge DailyA free daily digest of the news that matters most.Email (required)Sign UpBy submitting your email, you agree to our Terms and Privacy Notice. This site is protected by reCAPTCHA and the Google Privacy Policy and Terms of Service apply.Advertiser Content FromThis is the title for the native ad Posts from this topic will be added to your daily email digest and your homepage feed. See All Tesla Posts from this topic will be added to your daily email digest and your homepage feed. 
The lawyer who beat Tesla is ready for ‘round two’

‘There are two Teslas,’ attorney Brett Schreiber told us. ‘There’s Tesla in the showroom and then there’s Tesla in the courtroom.’

By Andrew J. Hawkins, transportation editor

The day after he won an unprecedented $243 million verdict in a wrongful death case against Tesla, attorney Brett Schreiber posted a reel on Instagram celebrating the victory. His song pick: 1992’s “Damn It Feels Good To Be a Gangsta” by the Geto Boys. “This is a verdict that will change the world,” Schreiber wrote in the caption, as Bushwick Bill, Willie D, and Scarface rap in the background about how “everything’s cool in the mind of a gangsta.” If that sounds like hyperbole, mixed with a dose of macho boasting, you’re not wrong. But in some sense, Schreiber earned his right to strut. The day before posting the reel, he stood in a Florida courtroom alongside his clients as a jury handed Tesla a major defeat.
The company was partially responsible — 33 percent, to be exact — for a 2019 crash that killed 22-year-old Naibel Benavides and seriously injured her boyfriend, Dillon Angulo. (It was determined that the driver of the Tesla Model S involved, George McGee, was mostly responsible. McGee settled with the families in 2021.) The company was ordered to pay as much as $243 million in punitive and compensatory damages to Angulo and Benavides’ family. It’s a huge sum, though one that could be reduced on appeal.

The verdict was highly unusual, insofar as the case even went to trial. Tesla has studiously avoided facing juries in fatality cases involving its driver assist technology, preferring to settle with the injured parties. But when it has gone to trial, it typically wins, such as in two previous cases in California. The winning streak came to an end with Schreiber’s case.

We spoke a few days after the decision about his historic win, how he used Elon Musk’s own words to bolster his case, and how Tesla could face even steeper fines in the many lawsuits that are still pending. This transcript has been edited for clarity.

I’ve been following Tesla for a very long time and I know that the company has a pretty solid track record of avoiding these types of judgments. Why do you think this was different?

Well, I mean, they did make an overture to settle the case, and for a very large sum of money. Now, it was a fraction of the verdict, but the condition of the settlement was that it would be secret. And my clients were not interested in a secret settlement. They knew that this was a case and a cause that was bigger than themselves. And it was important to them that we shine a light on what Tesla has done. My theme in my closing argument was about Tesla’s choices and Tesla’s words. And to your point as to why they’ve been successful, I think it’s in part because there are two Teslas. There’s Tesla in the showroom and then there’s Tesla in the courtroom.
And Tesla in the showroom tells you that they’ve invented the greatest full self-driving car the world has ever seen. Mr. Musk has been peddling to consumers and investors for more than a decade that the cars are fully self-driving, that the hardware is capable of full autonomy. And those statements were as untrue the day he said them as they remain untrue today. But then they showed up in a courtroom and they say, No, no, no, this is nothing but a driver assistance feature.

“My theme in my closing argument was about Tesla’s choices and Tesla’s words.”

Words matter. Choices matter. … Sometime later today or tomorrow, whenever the clerk finally approves it, all of the admitted trial exhibits are going to be publicly filed on the federal docket in Miami. Documents that only I and the lawyers involved in this case have seen, which show Tesla knew about people’s constant misuse of their system. They knew how they were misusing it. They knew why they were misusing it. And they knew when they were misusing it — going back a decade. And I would encourage you and anybody else who’s been following these to log on to Pacer and to pull those exhibits down because they have never been seen before. And I have been saying for well over a year that the only way that this information was ever going to become public was inside of a courtroom. And we did that, and the only reason we were able to do that was because the Benavides and Angulo families were courageous enough to stand up to the largest corporation in the world and say, No, you are not going to settle this in secret. This is going to be shared publicly. And the jury saw it with a unanimous verdict. They sent a message to Tesla. Your choices and your words matter. Do better.

I wanted to ask you about how you presented this case to the jury. And specifically the role that Elon Musk and his words played. You made a very pointed show of bringing in comments from the Tesla CEO.
Can you talk a little bit about the role that you think that his past comments about Tesla’s Autopilot and about its self-driving capabilities played in this particular trial?

[Musk’s words] were central to it. … The jury is asked about the expectation of an ordinary consumer. What would an ordinary consumer expect this vehicle to do? It didn’t actually matter what the driver himself thought. Now, his feelings about this were very consistent with Musk’s statements, but [Musk] says these things for a reason. He says this to create this idea in the public’s mind that these cars are more than they really are. He makes these comments going back to 2015. Autonomous driving is a solved problem. They are safer than humans. It will stop for anything. It knows if there’s something metal and something dense in front of it, it should stop. It doesn’t matter if it’s an alien spaceship, he said. And we played all of those because that aligns with the law. I said, “Look, you may know that Chick-fil-A has a cow running around telling you to eat more chicken and that LiMu the emu and Doug wants you to buy Liberty Mutual insurance. And that Geico has a gecko that peddles its products.” I said, “No one knows who Andrew Cathy, Tim Sweeney, and Todd Combs are. They are the CEOs of those companies. And even though no one’s heard of them … the decisions they make and the words they speak define what an ordinary consumer thinks of their company and the products they sell.” And I said the same is true of Elon Musk and Tesla. They cannot escape the fact that they have represented for a decade that they have invented and made a vehicle to the public that is the greatest, most advanced, Enhanced Autopilot driving vehicle the world has ever seen. And then they show up in court and they go, Well, there’s no vehicle in 2019 that would have ever stopped under this scenario. It’s a T intersection. It’s a broadside hit. Blah, blah, blah.
Well, that opened the door to me to say, “You cannot make these statements publicly and then use as a defense in trial the fact that the car that you’ve claimed for a decade you invented doesn’t actually exist.” The jury saw through it.

This is a type of technology that is used throughout the auto industry. Other car companies have a variety of ADAS technologies that are out there. What’s different about Tesla’s approach? And how did that contribute to this crash, in your opinion?

[GM’s] Super Cruise, [Ford’s] BlueCruise, right, those were similar vintage-era Level 2 systems. They had driver monitoring systems that actually worked. They use infrared cameras. They had systems that were geofenced. You could only use them on certain roadways that they were designed for. A lot of other systems at the time, I think Infiniti, Nissan, Honda, and somebody else I can’t remember right now had a system where if you override the adaptive cruise control, the lane centering shuts off. Because you’re either going to use it or you’re not. Tesla didn’t do any of those things. That was their choice. And that goes back to the whole thing that makes them the outlier. This was not a car company that got into tech. This was a tech company that got into cars. And their production process was unlike what any other responsible automotive manufacturer has ever done. Rather than ensuring that things were ready for prime time, rather than releasing a finished product, they released a beta product. But they tell you, We call it beta, but we don’t really mean it’s beta. Again, they use words to the point where they become meaningless.

“This was not a car company that got into tech. This was a tech company that got into cars.”

I’m sure you’ve been online and you’ve seen some reactions to this verdict. I’ve seen a few from Tesla’s fan base, which is quite substantive.
And they talk about how the technology in [George] McGee’s vehicle, Autopilot, is an outdated system — it hasn’t been updated since 2019 — whereas most of the current system, Full Self-Driving or FSD, is supposed to be measurably better. Do they have a point? Or is it beside the point for the outcome of this jury?

For the outcome of this jury, it is beside the point. We could not introduce evidence about 2023 and 2024 and later developments. But I got news for the fan base. It’s not better. They’ve actually eliminated radar. They’ve got cameras only. It doesn’t work. Everyone who knows anything and who’s been following and paying attention in autonomous vehicle development for the last decade knows that the holy trinity of safety is lidar, radar, and cameras. You cannot create a camera-based-only system that is going to be better than a human driver. It’s not possible. It’s not done. They sure as heck haven’t done it. And their fusion system … continues to fail. You will see internal documents produced by Tesla where they determine that in 6 percent of the crashes that they received information on in 2019, they themselves determined that Autopilot was at fault. It’s so stupid, but my point is, it’s not better. It’s a three-legged stool. If you take one of the legs out, the other two fall down. Like I said, I’ll tip my hat to Waymo. I’ll tip my hat to those guys. They geofence. They three-dimensionally map, they tie in infrastructure, they use lidar, they use radar, they use cameras. Are they perfect? No. And that’s the other thing I want to be really clear about. We are not anti-autonomous vehicle technology. We are not anti-progress. To the contrary, we think this stuff can and will save lives. It just has to be done the right way. And Tesla’s done it the wrong way. And this unanimous jury who sat for three weeks listening to 40-plus hours a week of testimony and evidence felt the same way.
Can you talk about what the trial revealed about how Tesla handles its Autopilot data, and also how it interacts with law enforcement when incidents arise and they need access to that data?

The docket should be fully unsealed in about three weeks. The court has ordered that and has given Tesla an opportunity to file a brief about anything specifically they want to keep under wraps. I am confident that the motion that we brought for sanctions against them for withholding evidence for four years will become fully unsealed. It would be irresponsible for me to say more than what they’ve said, but suffice it to say, there is more to that story, and it will be set out. But to that end, I can say that Tesla has a system of gathering data. They receive it immediately after crashes. And it is a very fair, I would say almost generous, statement to say that they’re not always forthright with that information. And it’s in part because people just don’t understand it. Law enforcement doesn’t understand it. Government investigators don’t understand it. Through this case, we actually understood it better than even Tesla’s lawyers did. Now, the in-house people knew. And again, I can’t say whose decision it was to delete the data. But somebody at Tesla knew that this information, coming on the heels of the Jeremy Banner crash in Deerfield Beach, Florida, just six weeks earlier, meant another Autopilot fatality, one that law enforcement wanted to share with federal investigators, and they knew that would be bad for business. Why they did it? Only they can answer that question.

“I am confident that the motion that we brought for sanctions against them for withholding evidence for four years will become fully unsealed.”

Tesla right now is trying to roll out a robotaxi service in a number of cities. What would you say to people who are interested in this, curious about trying out these vehicles, to regulators who are weighing whether to approve Tesla’s requests?
I would say that this verdict hopefully sends a very clear message to Tesla. That they need to do better. They need to elevate people’s lives and people’s safety over greed and profits. That’s what I told the jury in closing argument, that this was not only just an opportunity. I know that jury instruction talks about punishing Tesla and deterring bad conduct. But I told them really this was an opportunity for them to help Tesla, because when a company gets to a point where they’re elevating profits and greed over people’s lives and safety, then that is a company that has lost its way. That is a company that needs to have its course corrected. What I hope, through their efforts at developing a Level 4 system, is that Tesla will receive this message for what it was. It’s an opportunity and a teachable moment to be better.

The problem is … it’s my understanding that it is a camera-only-based system. That’s a problem.

Right, the cameras on a 2025 Tesla, if they’re anything like what they’re putting on the robotaxi, have a lower megapixel resolution than my iPhone. The human eye is 250 megapixels. Be better. There’s a reason why responsible manufacturers are doing this differently. And again, is it hubris? Is it greed? I don’t know. I don’t know what this motivation is to double down and just try to do it the way that, Oh, we can do it this way and no one else can. I struggle with that. I’m just a lawyer. What do I know? But engineers, people who have spent decades, careers, lifetimes studying this stuff, have reached the same conclusions. So my hope is that they pause. They look at what they’re doing and they find ways to do it better. To do it safer. That’s what this verdict was about: sending a message that you cannot use our public roadways as your personal laboratory to test production vehicles. And then when you discover that an incident occurs, that you make an incremental change. That’s what their corporate representative said.
And as I said a couple of times publicly and told the jury, an incident to the families impacted is known as a funeral. These are people’s lives that you’re playing with. So my hope is that they really think about their approach. I hope that consumers demand that they rethink their approach. I hope that analysts looking at the impact of this verdict and potential verdicts in the future tell them that they need to do better. Because I think that’s the only way that we’re going to ever see it really change.

It seems likely that there will need to be more of these types of verdicts before we do see some change, either from the company or from the way that the market views the company.

“Is it hubris? Is it greed? I don’t know.”

Now that we have this verdict, I’m curious to know what you think it’s going to mean for future cases pending against Tesla?

Round two, Maldonado v. Tesla, Alameda State Superior Court, 75 days from today. Tesla’s going to find out. I’m the plaintiff’s lawyer in that case. And I am not limited in California to a 3x multiplier on punitives. If I had asked that jury in Florida for a billion dollars, they would have given it to me. But I couldn’t ask them for that. Florida law says punitives can only be three times compensatories. I asked for $104 million in compensatories. They gave me $129 [million]. I got to tell you something, as a trial lawyer, to get $25 million over your ask is unheard of. They would have given me anything I asked for, not because it was me, but because of the facts. The facts are a stubborn thing. And we get to tell those same facts with a better Autopilot defect theory. And I get to not only juxtapose Musk’s lies in that case, but I juxtapose them with the testimony that I didn’t have in Miami. I’ve only had this case for a year. I worked the Maldonado case from the beginning. And in that case, I have testimony from all of the senior Autopilot leadership: Sterling Anderson, CJ Moore, Andrej Karpathy.
And I show them those same quotes that were played to that jury in Miami. I said, “When Mr. Musk said those things, was that a true statement about production vehicles at Tesla?” To a person, they answer: Absolutely not. He not only betrayed the public, he betrayed his own engineers. Betrayal is the most powerful human emotion there is, especially when it comes to rendering a verdict and holding a company to account. That’s what round two is going to look like in 75 days. That should be very interesting.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Center_for_AI_Safety] | [TOKENS: 318] |
Center for AI Safety

The Center for AI Safety (CAIS) is an American nonprofit organization based in San Francisco that promotes the safe development and deployment of artificial intelligence (AI). CAIS's work encompasses research in technical AI safety and AI ethics, advocacy, and support to grow the AI safety research field. It was founded in 2022 by Dan Hendrycks and Oliver Zhang. In May 2023, CAIS published the statement on AI risk of extinction, signed by hundreds of AI professors, leaders of major AI companies, and other public figures.

Research

CAIS researchers published "An Overview of Catastrophic AI Risks", which details risk scenarios and risk mitigation strategies. Risks described include the use of AI in autonomous warfare or for engineering pandemics, as well as AI capabilities for deception and hacking. Another work, conducted in collaboration with researchers at Carnegie Mellon University, described an automated way to discover adversarial attacks on large language models that bypass safety measures, highlighting the inadequacy of current safety systems.

Other activities

Other initiatives include a compute cluster to support AI safety research, an online course titled "Intro to ML Safety", and a fellowship for philosophy professors to address conceptual problems. The Center for AI Safety Action Fund is a sponsor of the California bill SB 1047, the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act. In 2023, the cryptocurrency exchange FTX, which went bankrupt in November 2022, attempted to recoup $6.5 million that it had donated to CAIS earlier that year.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Television_Newsreel] | [TOKENS: 1009] |
Television Newsreel

Television Newsreel is a British television programme, the first regular news programme to be made in the UK. Produced by the BBC and screened on the BBC Television Service from 1948 to 1954 at 7.30 pm, it adapted the traditional cinema newsreel form for the television audience, covering news and current affairs stories as well as quirkier 'human interest' items, sports and cultural events. The programme's opening title sequence, featuring a graphic of the transmission mast at Alexandra Palace with the title revolving around it, became a well-known image of the time. The theme tune was "Girls in Grey" by Charles Williams, played by the Queen's Hall Light Orchestra. It was published by Chappell on one of its mood music records; it was not specifically written for the newsreel but composed during World War Two for the Women's Junior Air Corps.

Overview

Previously, the BBC had screened cinema newsreels from British Movietone News, as well as sound-only news bulletins from BBC Radio. Following the resumption of the television service in 1946, after its World War II hiatus, a BBC Film Unit was set up to produce items on film. This contrasted with the vast majority of the BBC's output of the time, which was transmitted live via the electronic cameras of the Alexandra Palace studios. The first Television Newsreel was shown on Monday 5 January 1948. Each edition was fifteen minutes long and consisted of a number of different items, tending to be fewer and longer than in cinema newsreels, most of which ran for only ten minutes in total. The items would have different presenters and would be linked by a narrated voiceover. The producer was Harold Cox, and D. A. Smith was the editor. The chief cameraman was Alan Lawson, and J. K. Byer was head of sound recordists. Editions were initially broadcast on Monday, Wednesday and Saturday evenings.
From April 1950 a special Children's Newsreel edition was shown on Saturday afternoons, for the benefit of the younger audience. Items from the United States were often used, produced by the NBC network; the BBC had a film exchange deal with the American broadcaster, under which the two would swap film reports they had produced. From 1951, a weekly Newsreel Review of the Week was produced to open programming on Sunday evenings, compiling highlights from the previous week's newsreel features. These weekly editions were presented by Edward Halliday, who sometimes appeared on-screen to link the various items. Due to the pre-prepared nature of the Newsreel, topicality and coverage of breaking news stories was impossible. Newsreel was not a true news programme as the term is understood in the 21st century; the series was regarded more as entertainment, while more serious news bulletins were produced by BBC Radio. These radio bulletins were sometimes broadcast on television, in sound only. The final edition of the series was broadcast on Sunday 4 July 1954. The following Monday, 5 July 1954, the first BBC News programme was broadcast, presented live in the studio by a newsreader. This newsreader was initially unseen and unnamed, because it was felt that identifying the news with one personality would detract from its seriousness. The newsreader linked the reports in the manner now familiar in news broadcasting. The new programme was initially titled News and Newsreel; after a short while the Newsreel portion was dropped, severing the last link with the Television Newsreel strand. Children's Newsreel, unlike the later Newsround, made no pretence at being a serious news report. This version began in April 1950 and continued until September 1961, outliving its adult parent series by seven years.

Archive status

The programmes were pre-shot on film, as opposed to being shown live.
This set it apart from most of the BBC's output of the late 1940s. Examples of Television Newsreel survive in the archives and are among the oldest extant pieces of BBC-produced television programming, although complete editions with the original linking narration are rare. The individual reports were designed to be re-used in shows such as Newsreel Review of the Week and the end-of-year review Scrapbook; consequently, reports were archived separately, rather than as complete editions of the programme. Many of the reports survive because their negatives were donated to the National Film Archive at the British Film Institute in the early 1950s. They were the first television material ever acquired by the archive, which now has an extensive collection of broadcast programmes. The BBC donated the films on condition that it could have access to them whenever desired, and subsequently made copies for its own archives. |
======================================== |
[SOURCE: https://www.ynet.co.il/news/article/rk9jnfi00bg] | [TOKENS: 298] |
One word in the law stopped President Trump When the American court blocked Trump's tariff plan, it did not strike down policy; it drew the boundary lines of power. In a democratic system, "to regulate" is not a blank check, and majority rule does not stand above the law and the separation of powers. This is a vital lesson for Israel: without a constitution, an independent court is the last barrier against government without limits |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XLfit] | [TOKENS: 251] |
XLfit XLfit is a Microsoft Excel add-in that can perform regression analysis, curve fitting, and statistical analysis. It is approved by the UK National Physical Laboratory and the US National Institute of Standards and Technology. XLfit can generate 2D and 3D graphs and analyse data sets. It includes over seventy linear and non-linear curve-fitting models, grouped into predefined categories. A range of statistical calculations can also be applied to the data from within the spreadsheet: available statistical models include Spearman's rank correlation, Student's t-test, the Mann–Whitney U test, least squares, and ANOVA, among others. A model editor also allows users to add their own models and statistics to the ones provided. Usage XLfit was validated by the UK National Physical Laboratory in 2004; the unit tests for this are provided in the model editor from version 5.4 onwards, to allow each version to be easily validated. XLfit was used by NASA to analyse the battery life of the Curiosity Mars rover. Licenses XLfit is proprietary software, offered under several licensing options. |
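To illustrate the kind of non-linear model such a package fits, here is a four-parameter logistic (dose-response) curve fitted by least squares. This is a sketch using SciPy, not XLfit itself; the function and parameter names are this example's own, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic (4PL), a common dose-response form of the kind
# curve-fitting add-ins provide among their predefined non-linear models.
def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

# Synthetic dose-response data with a little measurement noise.
rng = np.random.default_rng(0)
doses = np.logspace(-2, 2, 20)                  # 0.01 .. 100
true_response = four_pl(doses, 5.0, 95.0, 1.5, 1.2)
response = true_response + rng.normal(0.0, 1.0, doses.size)

# Least-squares fit; p0 supplies rough starting guesses for the optimiser.
params, _cov = curve_fit(four_pl, doses, response,
                         p0=[0.0, 100.0, 1.0, 1.0])
bottom, top, ec50, hill = params
```

With reasonable starting guesses the fit recovers the underlying plateau values and EC50 closely, which is the same workflow (model choice, initial estimates, least-squares refinement) a spreadsheet add-in automates.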
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Facebook_Bluetooth_Beacon] | [TOKENS: 103] |
Facebook Bluetooth Beacon The Facebook Bluetooth Beacon is a hardware beacon released by Facebook in 2015. The beacon uses a Bluetooth connection to communicate with the Facebook app on the user's smartphone, informing it of the phone's location. The technology allows location-specific advertising to be pushed to the user's Facebook feed. In June 2015, Facebook gave free beacons to a number of businesses in the United States. |
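Facebook did not publish its beacon's advertisement format, but proximity beacons of this era typically broadcast an iBeacon-style Bluetooth Low Energy frame. Purely as an illustration of how an app can learn its location from such a broadcast, the sketch below parses the standard iBeacon manufacturer-data layout; the field names are this example's assumptions, not Facebook's.

```python
import struct
from typing import NamedTuple

class Beacon(NamedTuple):
    uuid: str       # proximity UUID identifying the deployment
    major: int      # e.g. a business
    minor: int      # e.g. a spot within that business
    tx_power: int   # calibrated RSSI at 1 m, signed dBm

def parse_ibeacon(mfg_data: bytes) -> Beacon:
    """Parse the manufacturer-specific field of an iBeacon advertisement.

    Layout: company ID 0x004C (little-endian), type 0x02, length 0x15,
    a 16-byte proximity UUID, big-endian 2-byte major and minor, and a
    signed 1-byte measured TX power used for distance estimation.
    """
    if len(mfg_data) != 25 or mfg_data[:4] != b"\x4c\x00\x02\x15":
        raise ValueError("not an iBeacon frame")
    uuid_hex = mfg_data[4:20].hex()
    uuid = (f"{uuid_hex[:8]}-{uuid_hex[8:12]}-{uuid_hex[12:16]}-"
            f"{uuid_hex[16:20]}-{uuid_hex[20:]}")
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    tx_power = struct.unpack("b", mfg_data[24:25])[0]
    return Beacon(uuid, major, minor, tx_power)
```

An app that scans for such frames can match the UUID/major/minor triple against a server-side registry of installed beacons to resolve the phone's location.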
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/TradingView] | [TOKENS: 257] |
TradingView TradingView is an American-British social media network, social trading network, financial analysis platform and mobile app for traders and investors. The company was founded in 2011 and has offices in New York, Málaga, Tbilisi and London. As of 2020, the company ranked in the top 130 websites globally according to Alexa. History TradingView was founded in 2011. TradingView is headquartered in New York and has its European market headquarters in London. The platform aims to help users around the world better understand financial markets by discussing investment ideas in an open forum. In the summer of 2013, the project was selected for the startup accelerator Techstars, where it signed contracts with Microsoft and CME. The company later received $3.6 million in funding from iTech Capital and other investors (TechStars, Right Side Capital Management, Irish Angels). In May 2018, the company closed another round of venture investment, for $37 million, led by Insight Venture Partners, DRW Venture Capital and Jump Capital, and acquired TradeIT. In October 2021, as part of its next round of financing, TradingView was valued at $3 billion and attracted an additional $298 million from investors, including Tiger Global Management. |
======================================== |
[SOURCE: https://www.theverge.com/news/768068/tesla-wrongful-death-verdict-court-toss] | [TOKENS: 1876] |
Tesla asks court to toss wrongful death verdict that cost it $243 million The company says the plaintiffs shouldn't have been allowed to involve Elon Musk's past statements about autonomy. by Andrew J. Hawkins, transportation editor, Aug 29, 2025, 3:32 PM UTC Lawyers for Tesla filed a motion in court Friday to throw out a jury verdict that found the company's Autopilot software had contributed to the death of a woman in a crash from 2019. Earlier this month, a jury found Tesla partially responsible for the death of 22-year-old Naibel Benavides, who was killed when a Model S driver plowed into her and her boyfriend Dillon Angulo. 
Tesla was ordered to pay the families of the victims $243 million in compensatory and punitive damages, a stunning outcome for a company that has managed to avoid taking responsibility for crashes involving its partially autonomous software. In the filing, Tesla's legal team said the Model S driver bore all the responsibility for the crash, and requested that the court invalidate the verdict or at least order a new jury trial. "The $243 million judgment against Tesla flies in the face of basic Florida tort law, the Due Process Clause, and common sense," the company's lawyers write, noting that the driver, McGee, had pressed the accelerator to override Autopilot in the seconds before the crash. "Auto manufacturers do not insure the world against harms caused by reckless drivers." The lawyers also claim that the plaintiffs should not have been allowed to enter into evidence statements from Tesla CEO Elon Musk, who has long claimed that the company's vehicles are capable of higher levels of autonomy than they actually are. And they said that claims about a data coverup on the part of Tesla — the company was accused of withholding camera data from police investigating the crash — were false and "inflamed" the jury against the company. The motion was filed by attorneys from Gibson Dunn, a firm that represented Tesla in a lawsuit against a former employee and a tech startup accused of stealing trade secrets for a robotic hand. Update August 29th: A spokesperson for plaintiff attorney Brett Schreiber sent the following statement: "This motion is the latest example of Tesla and Musk's complete disregard for the human cost of their defective technology. 
The jury heard all the facts and came to the right conclusion that this was a case of shared responsibility, but that does not discount the integral role Autopilot and the company's misrepresentations of its capabilities played in the crash that killed Naibel and permanently injured Dillon. We are confident the court will uphold this verdict, which serves not as an indictment of the autonomous vehicle industry, but of Tesla's reckless and unsafe development and deployment of its Autopilot system." |
======================================== |
[SOURCE: https://www.wired.com/video/watch/how-to-organize-safely-in-the-age-of-surveillance] | [TOKENS: 132] |
How to Organize Safely in the Age of Surveillance Released on 02/20/2026 |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_ref-124] | [TOKENS: 6011] |
Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan, and the vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the latter of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. 
Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, the arts, literature, heraldry, politics, and sports. Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. 
In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot produce their own food, a feature they share with fungi. Animals ingest organic material and digest it internally. Animals have structural characteristics that set them apart from all other living things: Typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible, and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. 
In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally leads to inbreeding depression within a population due to the increased prevalence of harmful recessive traits. Animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites. Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction where the predator feeds on another organism, its prey, who often evolves anti-predator adaptations to avoid being fed upon. Selective pressures imposed on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic/competitive coevolutions. Almost all multicellular predators are animals. 
Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels acquire the nutrients indirectly by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow and to sustain basal metabolism and fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move on to land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments are the Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera and Nematoda. 
Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres. Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. The following table lists estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine), and free-living or parasitic ways of life. Species estimates shown here are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. 
Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011. Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges, based on molecular clock estimates for the origin of 24-ipc production in both groups: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia establishes that they were animals. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may however be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. 
That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear for example in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms. However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing the external phylogeny shown in the cladogram. Uncertainty of relationships is indicated with dashed lines. 
The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. [Cladogram taxa: Holomycota (inc. fungi); Ichthyosporea; Pluriformea; Filasterea] The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. Like the sponges, Placozoa has no symmetry, and was often considered a "missing link" between protists and multicellular animals; the presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, with the following cladogram for the sponge-sister view that they supported (their ctenophore-sister tree simply interchanging the places of ctenophores and sponges): [Cladogram: Porifera; Ctenophora; Placozoa; Cnidaria; Bilateria] Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to construct the following ctenophore-sister phylogeny: [Cladogram: Ctenophora; Porifera; Placozoa; Cnidaria; Bilateria] Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. 
They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research. The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, which have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. A modern consensus phylogenetic tree for the Bilateria is shown below.

[Cladogram: Xenacoelomorpha, Ambulacraria, Chordata, Ecdysozoa, Spiralia]

Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles, that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells.
However, over evolutionary time, descendant groups have evolved that have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes.
The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.

History of classification

In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians.
In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes (radiata) (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia.

In human culture

The human population exploits a large number of other animal species for food, both of domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world. Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects.
Working animals including cattle and horses have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccines were discovered in the 18th century. Some medicines such as the cancer drug trabectedin are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, with invertebrates such as tarantulas, octopuses, and praying mantises, reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots all finding a place. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans, and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also the symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros, and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism.
Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship.
[SOURCE: https://en.wikipedia.org/wiki/Jews#cite_note-187]
Jews

Jews (Hebrew: יְהוּדִים, ISO 259-2: Yehudim, Israeli pronunciation: [jehuˈdim]), or the Jewish people, are an ethnoreligious group and nation, originating from the Israelites of ancient Israel and Judah. They traditionally adhere to Judaism. Jewish ethnicity, religion, and community are highly interrelated, as Judaism is an ethnic religion, though many ethnic Jews do not practice it. Religious Jews regard converts to Judaism as members of the Jewish nation, pursuant to the long-standing conversion process. The Israelites emerged from the pre-existing Canaanite peoples to establish Israel and Judah in the Southern Levant during the Iron Age. Originally, Jews referred to the inhabitants of the kingdom of Judah and were distinguished from the gentiles and the Samaritans. According to the Hebrew Bible, these inhabitants predominantly originate from the tribe of Judah, who were descendants of Judah, the fourth son of Jacob. The tribe of Benjamin was another significant demographic in Judah, and its members were considered Jews too. By the late 6th century BCE, Judaism had evolved from the Israelite religion, dubbed Yahwism (for Yahweh) by modern scholars, having a theology that religious Jews believe to be the expression of the Mosaic covenant between God and the Jewish people. After the Babylonian exile, Jews referred to followers of Judaism, descendants of the Israelites, citizens of Judea, or allies of the Judean state. Jewish migration within the Mediterranean region during the Hellenistic period, followed by population transfers caused by events like the Jewish–Roman wars, gave rise to the Jewish diaspora, consisting of diverse Jewish communities that maintained their sense of Jewish history, identity, and culture.
In the following millennia, Jewish diaspora communities coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (Central and Eastern Europe), the Sephardim (Iberian Peninsula), and the Mizrahim (Middle East and North Africa). While these three major divisions account for most of the world's Jews, there are other smaller Jewish groups outside of the three. Prior to World War II, the global Jewish population reached a peak of 16.7 million, representing around 0.7% of the world's population at that time. During World War II, approximately six million Jews throughout Europe were systematically murdered by Nazi Germany in a genocide known as the Holocaust. Since then, the population has slowly risen again, and as of 2021 was estimated by the demographer Sergio Della Pergola at 15.2 million, or less than 0.2% of the total world population as of 2012.[b] Today, over 85% of Jews live in Israel or the United States. Israel, whose population is 73.9% Jewish, is the only country where Jews comprise more than 2.5% of the population. Jews have significantly influenced and contributed to the development and growth of human progress in many fields, both historically and in modern times, including in science and technology, philosophy, ethics, literature, governance, business, art, music, comedy, theatre, cinema, architecture, food, medicine, and religion. Jews founded Christianity and had an indirect but profound influence on Islam. In these ways and others, Jews have played a significant role in the development of Western culture.

Name and etymology

The term "Jew" is derived from the Hebrew word יְהוּדִי Yehudi, with the plural יְהוּדִים Yehudim. Endonyms in other Jewish languages include the Ladino ג׳ודיו Djudio (plural ג׳ודיוס, Djudios) and the Yiddish ייִד Yid (plural ייִדן Yidn).
Though Genesis 29:35 and 49:8 connect "Judah" with the verb yada, meaning "praise", scholars generally agree that "Judah" most likely derives from the name of a Levantine geographic region dominated by gorges and ravines. The gradual ethnonymic shift from "Israelites" to "Jews", regardless of their descent from Judah, although not contained in the Torah, is made explicit in the Book of Esther (4th century BCE) of the Tanakh. Some modern scholars disagree with the conflation, based on the works of Josephus, Philo and Apostle Paul. The English word "Jew" is a derivation of Middle English Gyw, Iewe. The latter was loaned from the Old French giu, which itself evolved from the earlier juieu, which in turn derived from judieu/iudieu which through elision had dropped the letter "d" from the Medieval Latin Iudaeus, which, like the New Testament Greek term Ioudaios, meant both "Jew" and "Judean" / "of Judea". The Greek term was a loan from Aramaic *yahūdāy, corresponding to Hebrew יְהוּדִי Yehudi. Some scholars prefer translating Ioudaios as "Judean" in the Bible since it is more precise, denotes the community's origins and prevents readers from engaging in antisemitic eisegesis. Others disagree, believing that it erases the Jewish identity of Biblical characters such as Jesus. Daniel R. Schwartz distinguishes "Judean" and "Jew". Here, "Judean" refers to the inhabitants of Judea, which encompassed southern Palestine. Meanwhile, "Jew" refers to the descendants of Israelites that adhere to Judaism. Converts are included in the definition. But Shaye J.D. Cohen argues that "Judean" is inclusive of believers of the Judean God and allies of the Judean state. Another scholar, Jodi Magness, wrote the term Ioudaioi refers to a "people of Judahite/Judean ancestry who worshipped the God of Israel as their national deity and (at least nominally) lived according to his laws." 
The etymological equivalent is in use in other languages, e.g., يَهُودِيّ yahūdī (sg.), al-yahūd (pl.), in Arabic, "Jude" in German, "judeu" in Portuguese, "Juif" (m.)/"Juive" (f.) in French, "jøde" in Danish and Norwegian, "judío/a" in Spanish, "jood" in Dutch, "żyd" in Polish etc., but derivations of the word "Hebrew" are also in use to describe a Jew, e.g., in Italian (Ebreo), in Persian ("Ebri/Ebrani" (Persian: عبری/عبرانی)) and Russian (Еврей, Yevrey). The German word "Jude" is pronounced [ˈjuːdə], the corresponding adjective "jüdisch" [ˈjyːdɪʃ] (Jewish) is the origin of the word "Yiddish". According to The American Heritage Dictionary of the English Language, fourth edition (2000), It is widely recognized that the attributive use of the noun Jew, in phrases such as Jew lawyer or Jew ethics, is both vulgar and highly offensive. In such contexts Jewish is the only acceptable possibility. Some people, however, have become so wary of this construction that they have extended the stigma to any use of Jew as a noun, a practice that carries risks of its own. In a sentence such as There are now several Jews on the council, which is unobjectionable, the substitution of a circumlocution like Jewish people or persons of Jewish background may in itself cause offense for seeming to imply that Jew has a negative connotation when used as a noun. 
Identity

Judaism shares some of the characteristics of a nation, an ethnicity, a religion, and a culture, making the definition of who is a Jew vary slightly depending on whether a religious or national approach to identity is used. Generally, in modern secular usage, Jews include three groups: people who were born to a Jewish family regardless of whether or not they follow the religion, those who have some Jewish ancestral background or lineage (sometimes including those who do not have strictly matrilineal descent), and people without any Jewish ancestral background or lineage who have formally converted to Judaism and therefore are followers of the religion. In the context of biblical and classical literature, Jews could refer to inhabitants of the Kingdom of Judah, or the broader Judean region, allies of the Judean state, or anyone that followed Judaism. Historical definitions of Jewish identity have traditionally been based on halakhic definitions of matrilineal descent, and halakhic conversions. These definitions of who is a Jew date back to the codification of the Oral Torah into the Babylonian Talmud, around 200 CE. Interpretations by Jewish sages of sections of the Tanakh – such as Deuteronomy 7:1–5, which forbade intermarriage between their Israelite ancestors and seven non-Israelite nations: "for that [i.e. giving your daughters to their sons or taking their daughters for your sons,] would turn away your children from following me, to serve other gods" – are used as a warning against intermarriage between Jews and gentiles. Leviticus 24:10 says that the son in a marriage between a Hebrew woman and an Egyptian man is "of the community of Israel." This is complemented by Ezra 10:2–3, where Israelites returning from Babylon vow to put aside their gentile wives and their children.
A popular theory is that the rape of Jewish women in captivity brought about the law of Jewish identity being inherited through the maternal line, although scholars challenge this theory citing the Talmudic establishment of the law from the pre-exile period. Another argument is that the rabbis changed the law of patrilineal descent to matrilineal descent due to the widespread rape of Jewish women by Roman soldiers. Since the anti-religious Haskalah movement of the late 18th and 19th centuries, halakhic interpretations of Jewish identity have been challenged. According to historian Shaye J. D. Cohen, the status of the offspring of mixed marriages was determined patrilineally in the Bible. He brings two likely explanations for the change in Mishnaic times: first, the Mishnah may have been applying the same logic to mixed marriages as it had applied to other mixtures (Kil'ayim). Thus, a mixed marriage is forbidden as is the union of a horse and a donkey, and in both unions the offspring are judged matrilineally. Second, the Tannaim may have been influenced by Roman law, which dictated that when a parent could not contract a legal marriage, offspring would follow the mother. Rabbi Rivon Krygier follows a similar reasoning, arguing that Jewish descent had formerly passed through the patrilineal line and that the law of matrilineal descent had its roots in the Roman legal system.

Origins

The prehistory and ethnogenesis of the Jews are closely intertwined with archaeology, biology, historical textual records, mythology, and religious literature. The ethnic origin of the Jews lies in the Israelites, a confederation of Iron Age Semitic-speaking tribes that inhabited a part of Canaan during the tribal and monarchic periods. Modern Jews are named after and also descended from the southern Israelite Kingdom of Judah. Gary A. Rendsburg links the early Canaanite nomadic pastoralist confederation to the Shasu, known to the Egyptians around the 15th century BCE.
According to the Hebrew Bible narrative, Jewish history begins with the Biblical patriarchs such as Abraham, his son Isaac, Isaac's son Jacob, and the Biblical matriarchs Sarah, Rebecca, Leah, and Rachel, who lived in Canaan. The twelve sons of Jacob subsequently gave rise to the Twelve Tribes. Jacob and his family migrated to Ancient Egypt after being invited to live with Jacob's son Joseph by the Pharaoh himself. Jacob's descendants were later enslaved until the Exodus, led by Moses. Afterwards, the Israelites conquered Canaan under Moses' successor Joshua, and went through the period of the Biblical judges after the death of Joshua. Through the mediation of Samuel, the Israelites were subject to a king, Saul, who was succeeded by David and then Solomon, after whom the United Monarchy ended and was split into a separate Kingdom of Israel and a Kingdom of Judah. The Kingdom of Judah is described as comprising the tribes of Judah, Benjamin and, partially, Levi. They later assimilated remnants of other tribes who migrated there from the northern Kingdom of Israel. In the extra-biblical record, the Israelites become visible as a people between 1200 and 1000 BCE. There is well-accepted archaeological evidence referring to "Israel" in the Merneptah Stele, which dates to about 1200 BCE, and in the Mesha Stele from 840 BCE. It is debated whether a period like that of the Biblical judges occurred and if there ever was a United Monarchy. There is further disagreement about the earliest existence of the Kingdoms of Israel and Judah and their extent and power. Historians agree that a Kingdom of Israel existed by c. 900 BCE, there is a consensus that a Kingdom of Judah existed by c. 700 BCE at least, and recent excavations in Khirbet Qeiyafa have provided strong evidence for dating the Kingdom of Judah to the 10th century BCE.
In 587 BCE, Nebuchadnezzar II, King of the Neo-Babylonian Empire, besieged Jerusalem, destroyed the First Temple and deported parts of the Judahite population. Scholars disagree regarding the extent to which the Bible should be accepted as a historical source for early Israelite history. Rendsburg states that there are two approximately equal groups of scholars who debate the historicity of the biblical narrative, the minimalists who largely reject it, and the maximalists who largely accept it, with the minimalists being the more vocal of the two. Some of the leading minimalists reframe the biblical account as constituting the Israelites' inspiring national myth narrative, suggesting that according to the modern archaeological and historical account, the Israelites and their culture did not overtake the region by force, but instead branched out of the Canaanite peoples and culture through the development of a distinct monolatristic—and later monotheistic—religion of Yahwism centered on Yahweh, one of the gods of the Canaanite pantheon. The growth of Yahweh-centric belief, along with a number of cultic practices, gradually gave rise to a distinct Israelite ethnic group, setting them apart from other Canaanites. According to Dever, modern archaeologists have largely discarded the search for evidence of the biblical narrative surrounding the patriarchs and the exodus. According to the maximalist position, the modern archaeological record independently points to a narrative which largely agrees with the biblical account. This narrative provides a testimony of the Israelites as a nomadic people known to the Egyptians as belonging to the Shasu. Over time these nomads left the desert and settled on the central mountain range of the land of Canaan, in simple semi-nomadic settlements in which pig bones are notably absent. This population gradually shifted from a tribal lifestyle to a monarchy. 
The archaeological record of the ninth century BCE provides evidence for two monarchies: one in the south under a dynasty founded by a figure named David, with its capital in Jerusalem, and one in the north under a dynasty founded by a figure named Omri, with its capital in Samaria. It also points to an early monarchic period in which these regions shared material culture and religion, suggesting a common origin. Archaeological finds also provide evidence for the later cooperation of these two kingdoms in their coalition against Aram, and for their destruction by the Assyrians and later by the Babylonians. Genetic studies on Jews show that most Jews worldwide bear a common genetic heritage which originates in the Middle East, and that they share certain genetic traits with other Gentile peoples of the Fertile Crescent. The genetic composition of different Jewish groups shows that Jews share a common gene pool dating back four millennia, as a marker of their common ancestral origin. Despite their long-term separation, Jewish communities maintained their unique commonalities, propensities, and sensibilities in culture, tradition, and language.

History

The earliest recorded evidence of a people by the name of Israel appears in the Merneptah Stele, which dates to around 1200 BCE. The majority of scholars agree that this text refers to the Israelites, a group that inhabited the central highlands of Canaan, where archaeological evidence shows that hundreds of small settlements were constructed between the 12th and 10th centuries BCE. The Israelites differentiated themselves from neighboring peoples through various distinct characteristics including religious practices, prohibition on intermarriage, and an emphasis on genealogy and family history. In the 10th century BCE, two neighboring Israelite kingdoms—the northern Kingdom of Israel and the southern Kingdom of Judah—emerged.
Since their inception, they shared ethnic, cultural, linguistic and religious characteristics despite a complicated relationship. Israel, with its capital mostly in Samaria, was larger and wealthier, and soon developed into a regional power. In contrast, Judah, with its capital in Jerusalem, was less prosperous and covered a smaller, mostly mountainous territory. However, while in Israel the royal succession was often decided by a military coup d'état, resulting in several dynasty changes, political stability in Judah was much greater, as it was ruled by the House of David for the whole four centuries of its existence. Scholars also describe Biblical Jews as a 'proto-nation', in the modern nationalist sense, comparable to classical Greeks, the Gauls and the British Celts. Around 720 BCE, the Kingdom of Israel was destroyed when it was conquered by the Neo-Assyrian Empire, which came to dominate the ancient Near East. Under the Assyrian resettlement policy, a significant portion of the northern Israelite population was exiled to Mesopotamia and replaced by immigrants from the same region. During the same period, and throughout the 7th century BCE, the Kingdom of Judah, now under Assyrian vassalage, experienced a period of prosperity and witnessed significant population growth. This prosperity continued until the Neo-Assyrian king Sennacherib devastated the region of Judah in response to a rebellion in the area, ultimately halting at Jerusalem. Later in the same century, the Assyrians were defeated by the rising Neo-Babylonian Empire, and Judah became its vassal. In 587 BCE, following a revolt in Judah, the Babylonian king Nebuchadnezzar II besieged and destroyed Jerusalem and the First Temple, putting an end to the kingdom. The majority of Jerusalem's residents, including the kingdom's elite, were exiled to Babylon. According to the Book of Ezra, the Persian Cyrus the Great ended the Babylonian exile in 538 BCE, the year after he captured Babylon.
The exile ended with the return under Zerubbabel the Prince (so called because he was a descendant of the royal line of David) and Joshua the Priest (a descendant of the line of the former High Priests of the Temple) and their construction of the Second Temple circa 521–516 BCE. As part of the Persian Empire, the former Kingdom of Judah became the province of Judah (Yehud Medinata), with a smaller territory and a reduced population. Judea was under control of the Achaemenids until the fall of their empire in c. 333 BCE to Alexander the Great. After several centuries under foreign imperial rule, the Maccabean Revolt against the Seleucid Empire resulted in an independent Hasmonean kingdom, under which the Jews once again enjoyed political independence for a period spanning from 110 to 63 BCE. Under Hasmonean rule the boundaries of their kingdom were expanded to include not only the land of the historical kingdom of Judah, but also the Galilee and Transjordan. In the beginning of this process the Idumeans, who had infiltrated southern Judea after the destruction of the First Temple, were converted en masse. In 63 BCE, Judea was conquered by the Romans. From 37 BCE to 6 CE, the Romans allowed the Jews to maintain some degree of independence by installing the Herodian dynasty as vassal kings. However, Judea eventually came directly under Roman control and was incorporated into the Roman Empire as the province of Judaea. The Jewish–Roman wars, a series of failed uprisings against Roman rule during the first and second centuries CE, had profound and devastating consequences for the Jewish population of Judaea. The First Jewish–Roman War (66–73/74 CE) culminated in the destruction of Jerusalem and the Second Temple, after which the significantly diminished Jewish population was stripped of political autonomy. 
A few generations later, the Bar Kokhba revolt (132–136 CE) erupted in response to Roman plans to rebuild Jerusalem as a Roman colony, and, possibly, to restrictions on circumcision. Its violent suppression by the Romans led to the near-total depopulation of Judea, and the demographic and cultural center of Jewish life shifted to Galilee. Jews were subsequently banned from residing in Jerusalem and the surrounding area, and the province of Judaea was renamed Syria Palaestina. These developments effectively ended Jewish efforts to restore political sovereignty in the region for nearly two millennia. Similar upheavals impacted the Jewish communities in the empire's eastern provinces during the Diaspora Revolt (115–117 CE), leading to the near-total destruction of Jewish diaspora communities in Libya, Cyprus and Egypt, including the highly influential community in Alexandria. The destruction of the Second Temple in 70 CE brought profound changes to Judaism. With the Temple's central place in Jewish worship gone, religious practices shifted towards prayer, Torah study (including Oral Torah), and communal gatherings in synagogues. Judaism also lost much of its sectarian nature. Two of the three main sects that flourished during the late Second Temple period, namely the Sadducees and Essenes, eventually disappeared, while Pharisaic beliefs became the foundational, liturgical, and ritualistic basis of Rabbinic Judaism, which has been the prevailing form of Judaism since late antiquity. The Jewish diaspora existed well before the destruction of the Second Temple in 70 CE and had been ongoing for centuries, with the dispersal driven by both forced expulsions and voluntary migrations.
In Mesopotamia, a testimony to the beginnings of the Jewish community can be found in Jehoiachin's ration tablets, which list provisions allotted to the exiled Judean king and his family by Nebuchadnezzar II; further evidence comes from the Al-Yahudu tablets, dated to the 6th–5th centuries BCE and related to the exiles who arrived from Judea after the destruction of the First Temple, though there is ample evidence for the presence of Jews in Babylonia even from 626 BCE. In Egypt, the documents from Elephantine reveal the trials of a community founded by a Jewish garrison in Persian service at two frontier fortresses during the 5th–4th centuries BCE, and according to Josephus the Jewish community in Alexandria had existed since the founding of the city in the 4th century BCE by Alexander the Great. By 200 BCE, there were well-established Jewish communities in both Egypt and Mesopotamia ("Babylonia" in Jewish sources), and in the two centuries that followed, Jewish populations were also present in Asia Minor, Greece, Macedonia, Cyrene, and, beginning in the middle of the first century BCE, in the city of Rome. Later, in the first centuries CE, as a result of the Jewish–Roman wars, a large number of Jews were taken as captives, sold into slavery, or compelled to flee from the regions affected by the wars, contributing to the formation and expansion of Jewish communities across the Roman Empire as well as in Arabia and Mesopotamia. After the Bar Kokhba revolt, the Jewish population in Judaea, now significantly reduced, made efforts to recover from the revolt's devastating effects, but never fully regained its former strength. Between the second and fourth centuries CE, the region of Galilee emerged as the primary center of Jewish life in Syria Palaestina, experiencing both demographic growth and cultural development. It was during this period that two central rabbinic texts, the Mishnah and the Jerusalem Talmud, were composed.
The Romans recognized the patriarchs—rabbinic sages such as Judah ha-Nasi—as representatives of the Jewish people, granting them a certain degree of autonomy. However, as the Roman Empire gave way to the Christianized Byzantine Empire under Constantine, Jews began to face persecution by both the Church and the imperial authorities, and many emigrated to communities in the diaspora. By the fourth century CE, Jews are believed to have lost their demographic majority in Syria Palaestina. The long-established Jewish community of Mesopotamia, which had been living under Parthian and later Sasanian rule, beyond the confines of the Roman Empire, became an important center of Jewish study as Judea's Jewish population declined. Estimates often place the Babylonian Jewish community of the 3rd to 7th centuries at around one million, making it the largest Jewish diaspora community of that period. Under the political leadership of the exilarch, who was regarded as a royal heir of the House of David, this community had an autonomous status and served as a place of refuge for the Jews of Syria Palaestina. A number of significant Talmudic academies, such as the Nehardea, Pumbedita, and Sura academies, were established in Mesopotamia, and many important Amoraim were active there. The Babylonian Talmud, a centerpiece of Jewish religious law, was compiled in Babylonia between the 3rd and 6th centuries. Jewish diaspora communities are generally described as having coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (initially in the Rhineland and France), the Sephardim (initially in the Iberian Peninsula), and the Mizrahim (Middle East and North Africa). Romaniote Jews, Tunisian Jews, Yemenite Jews, Egyptian Jews, Ethiopian Jews, Bukharan Jews, Mountain Jews, and other groups also predated the arrival of the Sephardic diaspora.
During the same period, Jewish communities in the Middle East thrived under Islamic rule, especially in cities like Baghdad, Cairo, and Damascus. In Babylonia, from the 7th to the 11th centuries, the Pumbedita and Sura academies led the Jews of the Arab world and, to an extent, the entire Jewish world. The deans and students of these academies defined the Geonic period in Jewish history. They were followed by the Rishonim, who lived from the 11th to the 15th centuries. Like their European counterparts, Jews in the Middle East and North Africa also faced periods of persecution and discriminatory policies; the Almohad Caliphate in North Africa and Iberia issued forced conversion decrees, causing Jews such as Maimonides to seek safety in other regions. Despite experiencing repeated waves of persecution, Ashkenazi Jews in Western Europe worked in a variety of fields, making an impact on their communities' economies and societies. In Francia, for example, figures like Isaac Judaeus and Armentarius occupied prominent social and economic positions. Francia also witnessed the development of a sophisticated tradition of biblical commentary, exemplified by Rashi and the tosafists. In 1144, the first documented blood libel occurred in Norwich, England, marking an escalation in the pattern of discrimination and violence to which Jews had already been subjected throughout medieval Europe. During the 12th and 13th centuries, Jews faced frequent antisemitic legislation, including laws prescribing distinctive dress, alongside segregation, repeated blood libels, pogroms, and massacres following the precedent of the Rhineland massacres of 1096. The Jews of the Holy Roman Empire were designated Servi camerae regis ("servants of the imperial chamber") by Frederick II, a status that afforded limited protection while simultaneously entangling them in the political struggles between the emperor and the German principalities and cities.
Persecution intensified during the Black Death in the mid-14th century, when Jews were accused of poisoning wells and many communities were destroyed. These pressures, combined with major expulsions such as that from England in 1290, gradually pushed Ashkenazi Jewish populations eastward into Poland, Lithuania, and Russia. One of the largest Jewish communities of the Middle Ages was in the Iberian Peninsula, which for a time contained the largest Jewish population in Europe. Iberian Jewry endured discrimination under the Visigoths but saw its fortunes improve under Umayyad rule and later the Taifa kingdoms. During this period, the Jews of Muslim Spain entered a "Golden Age" marked by achievements in Hebrew poetry and literature, religious scholarship, grammar, medicine and science, with leading figures including Hasdai ibn Shaprut, Judah Halevi, Moses ibn Ezra and Solomon ibn Gabirol. Jews also rose to high office, most notably Samuel ibn Naghrillah, a scholar and poet who served as grand vizier and military commander of Granada. The Golden Age ended with the advancing Reconquista and the rise of the radical Almoravid and Almohad dynasties, whose persecutions drove many Jews, including Maimonides, from Iberia. In 1391, widespread pogroms swept across Spain, leaving thousands dead and forcing mass conversions. The Spanish Inquisition was later established to pursue, torture and execute conversos who continued to practice Judaism in secret, while public disputations were staged to discredit Judaism. In 1492, after the Reconquista, Isabella I of Castile and Ferdinand II of Aragon decreed the expulsion of all Jews who refused conversion, sending an estimated 200,000 into exile in Portugal, Italy, North Africa, and the Ottoman Empire. In 1497, Portugal's roughly 30,000 Jews were formally ordered expelled but were instead forcibly converted so that they would retain their economic role. In 1498, some 3,500 Jews were expelled from Navarre.
Many converts outwardly adopted Christianity while secretly preserving Jewish practices, becoming crypto-Jews (also known as marranos or anusim), who remained targets of the various Inquisitions for centuries. Following the expulsions from Spain and Portugal in the 1490s, Jewish exiles dispersed across the Mediterranean, Europe, and North Africa. Many settled in the Ottoman Empire—which replaced the Iberian Peninsula as home to the world's largest Jewish population—where new communities developed in Anatolia, the Balkans, and the Land of Israel. Cities such as Istanbul and Thessaloniki grew into major Jewish centers, while in 16th-century Safed a flourishing spiritual life took shape. There, Solomon Alkabetz, Moses Cordovero, and Isaac Luria developed influential new schools of Kabbalah, giving powerful impetus to Jewish mysticism, and Joseph Karo composed the Shulchan Aruch, which became a cornerstone of Jewish law. In the 17th century, Portuguese conversos who returned to Judaism and engaged in trade and banking helped establish Amsterdam as a prosperous Jewish center, while also forming communities in cities such as Antwerp and London. This period also witnessed waves of messianic fervor, most notably the rise of the Sabbatean movement in the 1660s, led by Sabbatai Zvi of İzmir, which reverberated throughout the Jewish world. In Eastern Europe, Poland–Lithuania became the principal center of Ashkenazi Jewry, eventually becoming home to the largest Jewish population in the world. Jewish life flourished there in the early modern era, supported by relative stability, economic opportunity, and strong communal institutions. The mid-17th century brought devastation with the Cossack uprisings in Ukraine, which reversed migration flows and sent refugees westward, yet Poland–Lithuania remained the demographic and cultural heartland of Ashkenazi Jewry.
Following the partitions of Poland, most of its Jews came under Russian rule and were confined to the "Pale of Settlement." The 18th century also witnessed new religious and intellectual currents. Hasidism, founded by the Baal Shem Tov, emphasized mysticism and piety, while its opponents, the Misnagdim ("opponents"), led by the Vilna Gaon, defended rabbinic scholarship and tradition. During the 1760s and 1770s, the Haskalah (Jewish Enlightenment) emerged in German-speaking lands, where figures such as Moses Mendelssohn promoted secular learning, vernacular literacy, and integration into European society. Jews had also begun to be readmitted to other parts of Western Europe, including England, where Menasseh ben Israel petitioned Oliver Cromwell for their return. In the Americas, Jews of Sephardic descent first arrived as conversos in Spanish and Portuguese colonies, where many faced trial by Inquisition tribunals for "judaizing." A more durable presence began in Dutch Brazil, where Jews openly practiced their religion and established the first synagogues in the New World, before the Portuguese reconquest forced their dispersal to Amsterdam, the Caribbean, and North America. Sephardic communities took root in Curaçao, Suriname, Jamaica, and Barbados, later joined by Ashkenazi migrants. In North America, Jews were present from the mid-17th century, with New Amsterdam hosting the first organized congregation in 1654. By the time of the American Revolution, small communities in New York, Newport, Philadelphia, Savannah, and Charleston played an active role in the struggle for independence. In the late 19th century, Jews in Western Europe gradually achieved legal emancipation, though social acceptance remained limited by persistent antisemitism and rising nationalism. In Eastern Europe, particularly within the Russian Empire's Pale of Settlement, Jews faced mounting legal restrictions and recurring pogroms.
From this environment emerged Zionism, a national revival movement originating in Central and Eastern Europe that sought to re-establish a Jewish polity in the Land of Israel as a means of returning the Jewish people to their ancestral homeland and ending centuries of exile and persecution. This led to waves of Jewish migration to Ottoman-controlled Palestine. Theodor Herzl, considered the father of political Zionism, offered his vision of a future Jewish state in his 1896 book Der Judenstaat (The Jewish State); a year later, he presided over the First Zionist Congress. The antisemitism that afflicted Jewish communities in Europe also triggered a mass exodus of 2.8 million Jews to the United States between 1881 and 1924. Despite this, some Jews in Europe and the United States made great achievements in various fields of science and culture. Among the most influential of this period were Albert Einstein in physics, Sigmund Freud in psychology, Franz Kafka in literature, and Irving Berlin in music. Many Nobel Prize winners of this era were Jewish, as is still the case. When Adolf Hitler and the Nazi Party came to power in Germany in 1933, the situation for Jews deteriorated rapidly as a direct result of Nazi policies. Many Jews fled Europe for Mandatory Palestine, the United States, and the Soviet Union to escape racial antisemitic laws, economic hardship, and the fear of an impending war. World War II began in 1939, and by 1941 Nazi Germany had occupied most of Europe. Following the German invasion of the Soviet Union in 1941, the Final Solution—an extensive, organized effort of unprecedented scope intended to annihilate the Jewish people—began, resulting in the persecution and murder of Jews in Europe and North Africa. In Poland, three million Jews were murdered in gas chambers across all the camps combined, one million of them at the Auschwitz camp complex alone.
The Holocaust is the name given to this genocide, in which six million Jews in total were systematically murdered. Before and during the Holocaust, enormous numbers of Jews immigrated to Mandatory Palestine. In 1944, the Jewish insurgency in Mandatory Palestine began with the aim of gaining full independence from the United Kingdom. On 14 May 1948, upon the termination of the mandate, David Ben-Gurion declared the creation of the State of Israel, a Jewish and democratic state. Immediately afterwards, all neighboring Arab states invaded and were resisted by the newly formed Israel Defense Forces. In 1949, the war ended and Israel began building its state and absorbing waves of Aliyah, granting citizenship to Jews all over the world via the Law of Return, passed in 1950. However, both the Israeli–Palestinian conflict and the wider Arab–Israeli conflict continue to this day.

Culture

The Jewish people and the religion of Judaism are strongly interrelated. Converts to Judaism have a status within the Jewish people equal to those born into it. However, converts who subsequently abandon Jewish practice are likely to be viewed with skepticism. Mainstream Judaism does not proselytize, and conversion is considered a difficult undertaking. A significant portion of conversions are undertaken by children of mixed marriages, or by would-be or current spouses of Jews. The Hebrew Bible, a religious interpretation of the traditions and early history of the Jews, established the first of the Abrahamic religions, which are now practiced by 54 percent of the world's population. Judaism guides its adherents in both practice and belief, and has been called not only a religion, but also a "way of life," which has made drawing a clear distinction between Judaism, Jewish culture, and Jewish identity rather difficult.
Throughout history, in eras and places as diverse as the ancient Hellenic world, Europe before and after the Age of Enlightenment (see Haskalah), Islamic Spain and Portugal, North Africa and the Middle East, India, China, and the contemporary United States and Israel, cultural phenomena have developed that are in some sense characteristically Jewish without being at all specifically religious. Some factors in this come from within Judaism, others from the interaction of Jews or specific communities of Jews with their surroundings, and still others from the inner social and cultural dynamics of the community, as opposed to the religion itself. This phenomenon has led to considerably different Jewish cultures unique to their own communities. Hebrew is the liturgical language of Judaism (termed lashon ha-kodesh, "the holy tongue"), the language in which most of the Hebrew scriptures (Tanakh) were composed, and the daily speech of the Jewish people for centuries. By the 5th century BCE, Aramaic, a closely related tongue, had joined Hebrew as a spoken language in Judea. By the 3rd century BCE, some Jews of the diaspora were speaking Greek. Others, such as the Jewish communities of Asoristan, known to Jews as Babylonia, were speaking Hebrew and Aramaic, the languages of the Babylonian Talmud. Dialects of these same languages were also used by the Jews of Syria Palaestina at that time. For centuries, Jews worldwide have spoken the local or dominant languages of the regions they migrated to, often developing distinctive dialectal forms or branches that became independent languages. Yiddish is the Judaeo-German language developed by Ashkenazi Jews who migrated to Central Europe. Ladino is the Judaeo-Spanish language developed by Sephardic Jews who migrated to the Iberian Peninsula.
Due to many factors, including the impact of the Holocaust on European Jewry, the Jewish exodus from Arab and Muslim countries, and widespread emigration from other Jewish communities around the world, ancient and distinct Jewish languages of several communities, including Judaeo-Georgian, Judaeo-Arabic, Judaeo-Berber, Krymchak, Judaeo-Malayalam and many others, have largely fallen out of use. For over sixteen centuries Hebrew was used almost exclusively as a liturgical language and as the language in which most books on Judaism were written, with a few Jews speaking only Hebrew on the Sabbath. It had not been used as a mother tongue since Tannaitic times. Hebrew was revived as a spoken language by Eliezer ben Yehuda, who arrived in Palestine in 1881. Modern Hebrew is designated as the "State language" of Israel. Despite efforts to revive Hebrew as the national language of the Jewish people, knowledge of the language is not commonly possessed by Jews worldwide, and English has emerged as the lingua franca of the Jewish diaspora. Although many Jews once had sufficient knowledge of Hebrew to study the classic literature, and Jewish languages like Yiddish and Ladino were commonly used as recently as the early 20th century, most Jews lack such knowledge today, and English has by and large superseded most Jewish vernaculars. The three most commonly spoken languages among Jews today are Hebrew, English, and Russian. Some Romance languages, particularly French and Spanish, are also widely used. Yiddish has been spoken by more Jews in history than any other language, but it is far less used today following the Holocaust and the adoption of Modern Hebrew by the Zionist movement and the State of Israel. In some places, the mother language of the Jewish community differs from that of the general population or the dominant group. For example, in Quebec, the Ashkenazic majority has adopted English, while the Sephardic minority uses French as its primary language.
Similarly, South African Jews adopted English rather than Afrikaans. Due to both Czarist and Soviet policies, Russian has superseded Yiddish as the language of Russian Jews, and these policies have also affected neighboring communities. Today, Russian is the first language of many Jewish communities in a number of post-Soviet states, such as Ukraine and Uzbekistan, as well as of Ashkenazic Jews in Azerbaijan, Georgia, and Tajikistan. Although communities in North Africa today are small and dwindling, Jews there have shifted from a multilingual group to a monolingual one (or nearly so), speaking French in Algeria, Morocco, and the city of Tunis, while most North Africans continue to use Arabic or Berber as their mother tongue. There is no single governing body for the Jewish community, nor a single authority with responsibility for religious doctrine. Instead, a variety of secular and religious institutions at the local, national, and international levels lead various parts of the Jewish community on a variety of issues. Today, many countries have a Chief Rabbi who serves as a representative of that country's Jewry. Although many Hasidic Jews follow a particular hereditary Hasidic dynasty, there is no one commonly accepted leader of all Hasidic Jews. Many Jews believe that the Messiah will act as a unifying leader for Jews and the entire world. A number of modern scholars of nationalism support the existence of Jewish national identity in antiquity. One of them is David Goodblatt, who generally believes in the existence of nationalism before the modern period. In his view, the Bible, the parabiblical literature, and Jewish national history provide the basis for a Jewish collective identity. Although many of the ancient Jews were illiterate (as were their neighbors), their national narrative was reinforced through public readings. The Hebrew language also constructed and preserved national identity.
Although it was not widely spoken after the 5th century BCE, Goodblatt states: the mere presence of the language in spoken or written form could invoke the concept of a Jewish national identity. Even if one knew no Hebrew or was illiterate, one could recognize that a group of signs was in Hebrew script. ... It was the language of the Israelite ancestors, the national literature, and the national religion. As such it was inseparable from the national identity. Indeed its mere presence in visual or aural medium could invoke that identity. Anthony D. Smith, an historical sociologist considered one of the founders of the field of nationalism studies, wrote that the Jews of the late Second Temple period provide "a closer approximation to the ideal type of the nation [...] than perhaps anywhere else in the ancient world." He adds that this observation "must make us wary of pronouncing too readily against the possibility of the nation, and even a form of religious nationalism, before the onset of modernity." Agreeing with Smith, Goodblatt suggests omitting the qualifier "religious" from Smith's definition of ancient Jewish nationalism, noting that, according to Smith, a religious component in national memories and culture is common even in the modern era. This view is echoed by political scientist Tom Garvin, who writes that "something strangely like modern nationalism is documented for many peoples in medieval times and in classical times as well," citing the ancient Jews as one of several "obvious examples", alongside the classical Greeks and the Gaulish and British Celts. Fergus Millar suggests that the sources of Jewish national identity and their early nationalist movements in the first and second centuries CE included several key elements: the Bible as both a national history and legal source, the Hebrew language as a national language, a system of law, and social institutions such as schools, synagogues, and Sabbath worship. 
Adrian Hastings argued that the Jews are the "true proto-nation" which, through the model of ancient Israel found in the Hebrew Bible, provided the world with the original concept of nationhood that later influenced Christian nations. However, following Jerusalem's destruction in the first century CE, Jews ceased to be a political entity and did not resemble a traditional nation-state for almost two millennia. Despite this, they maintained their national identity through collective memory, religion, and sacred texts, even without land or political power, and remained a nation rather than merely an ethnic group, eventually leading to the rise of Zionism and the establishment of Israel. Steven Weitzman suggests that Jewish nationalist sentiment in antiquity was encouraged because under foreign rule (Persians, Greeks, Romans) Jews were able to claim that they were an ancient nation. This claim was based on the preservation and reverence of their scriptures, the Hebrew language, the Temple and priesthood, and other traditions of their ancestors. Doron Mendels further observes that the Hasmonean kingdom, one of the few examples of indigenous statehood in its time, significantly reinforced Jewish national consciousness. The memory of this period of independence contributed to the persistent efforts to revive Jewish sovereignty in Judea, leading to the major revolts against Roman rule in the 1st and 2nd centuries CE.

Demographics

Within the world's Jewish population there are distinct ethnic divisions, most of which are primarily the result of geographic branching from an originating Israelite population and subsequent independent development. An array of Jewish communities was established by Jewish settlers in various places around the Old World, often at great distances from one another, resulting in effective and often long-term isolation.
During the millennia of the Jewish diaspora, these communities developed under the influence of their local environments: political, cultural, natural, and demographic. Today, manifestations of these differences among Jews can be observed in the cultural expressions of each community, including Jewish linguistic diversity, culinary preferences, liturgical practices, religious interpretations, and degrees and sources of genetic admixture. Jews are often identified as belonging to one of two major groups: the Ashkenazim and the Sephardim. Ashkenazim are so named in reference to their geographical origins (their ancestors' culture coalesced in the Rhineland, an area historically referred to by Jews as Ashkenaz). Similarly, Sephardim (Sefarad meaning "Spain" in Hebrew) are named in reference to their origins in Iberia. The diverse groups of Jews of the Middle East and North Africa are often collectively referred to as Sephardim, together with Sephardim proper, for liturgical reasons having to do with their prayer rites. A common term for many of these non-Spanish Jews who are sometimes still broadly grouped as Sephardim is Mizrahim (lit. 'easterners' in Hebrew). Nevertheless, Mizrahim and Sephardim are usually ethnically distinct. Smaller groups include, but are not restricted to, Indian Jews such as the Bene Israel, Bnei Menashe, Cochin Jews, and Bene Ephraim; the Romaniotes of Greece; the Italian Jews ("Italkim" or "Bené Roma"); the Teimanim from Yemen; various African Jews, including most numerously the Beta Israel of Ethiopia; and Chinese Jews, most notably the Kaifeng Jews, as well as various other distinct but now almost extinct communities. The divisions between all these groups are approximate and their boundaries are not always clear.
The Mizrahim, for example, are a heterogeneous collection of North African, Central Asian, Caucasian, and Middle Eastern Jewish communities that are no more closely related to each other than they are to any of the previously mentioned Jewish groups. In modern usage, however, the Mizrahim are sometimes termed Sephardi due to similar styles of liturgy, despite their independent development from Sephardim proper. Thus, among Mizrahim there are Egyptian Jews, Iraqi Jews, Lebanese Jews, Kurdish Jews, Moroccan Jews, Libyan Jews, Syrian Jews, Bukharian Jews, Mountain Jews, Georgian Jews, Iranian Jews, Afghan Jews, and various others. The Teimanim from Yemen are sometimes included, although their style of liturgy is unique and the admixture found among them differs from that found among Mizrahim. In addition, a distinction is made between the Sephardi migrants who established themselves in the Middle East and North Africa after the expulsion of the Jews from Spain and Portugal in the 1490s and the pre-existing Jewish communities in those regions. Ashkenazi Jews represent the bulk of modern Jewry, at least 70 percent of Jews worldwide (and up to 90 percent prior to World War II and the Holocaust). As a result of their emigration from Europe, Ashkenazim also represent the overwhelming majority of Jews in the New World, in countries such as the United States, Canada, Argentina, Australia, and Brazil. In France, the immigration of Jews from Algeria (Sephardim) has led them to outnumber the Ashkenazim. Only in Israel is the Jewish population representative of all groups, a melting pot independent of each group's proportion within the overall world Jewish population. Y-DNA studies tend to imply a small number of founders in an old population whose members parted and followed different migration paths. In most Jewish populations, these male-line ancestors appear to have been mainly Middle Eastern.
For example, Ashkenazi Jews share more common paternal lineages with other Jewish and Middle Eastern groups than with non-Jewish populations in the areas where Jews lived in Eastern Europe, Germany, and the French Rhine Valley. This is consistent with Jewish traditions placing most Jewish paternal origins in the Middle East. Conversely, the maternal lineages of Jewish populations, studied by looking at mitochondrial DNA, are generally more heterogeneous. Scholars such as Harry Ostrer and Raphael Falk believe this indicates that many Jewish males found new mates from European and other communities in the places where they migrated in the diaspora after fleeing ancient Israel. In contrast, Behar has found evidence that about 40 percent of Ashkenazi Jews originate maternally from just four female founders, who were of Middle Eastern origin. The populations of Sephardi and Mizrahi Jewish communities "showed no evidence for a narrow founder effect." Subsequent studies carried out by Feder et al. confirmed the large portion of non-local maternal origin among Ashkenazi Jews. Reflecting on their findings related to the maternal origin of Ashkenazi Jews, the authors conclude: "Clearly, the differences between Jews and non-Jews are far larger than those observed among the Jewish communities. Hence, differences between the Jewish communities can be overlooked when non-Jews are included in the comparisons." However, a 2025 genetic study of the Ashkenazi Jewish founder population supports the presence of a substantial Near Eastern component in the maternal lineages. Analyses of mitochondrial DNA (mtDNA) indicate that the core founder lineages, estimated at around 54, likely originated in the Near East, with these founder signatures appearing in multiple copies across the population. While later admixture introduced additional mtDNA lineages, these absorbed lineages are distinguishable from the original founders.
The findings are consistent with genome-wide identity-by-descent and lineage-extinction analyses, reinforcing the Near Eastern origin of the Ashkenazi maternal founders. One study showed that 7% of Ashkenazi Jews carry the haplogroup G2c, which is found mainly among Pashtuns and, at lower frequencies, among all major Jewish groups, Palestinians, Syrians, and Lebanese. Studies of autosomal DNA, which look at the entire DNA mixture, have become increasingly important as the technology has developed. They show that Jewish populations have tended to form relatively closely related groups in independent communities, with most members of a community sharing significant common ancestry. For Jewish populations of the diaspora, the genetic composition of Ashkenazi, Sephardic, and Mizrahi Jewish populations shows a predominant amount of shared Middle Eastern ancestry. According to Behar, the most parsimonious explanation for this shared Middle Eastern ancestry is that it is "consistent with the historical formulation of the Jewish people as descending from ancient Hebrew and Israelite residents of the Levant" and "the dispersion of the people of ancient Israel throughout the Old World". North African, Italian, and other Jews of Iberian origin show variable frequencies of admixture with non-Jewish historical host populations among the maternal lines. In the case of Ashkenazi and Sephardi Jews (in particular Moroccan Jews), who are closely related, the source of non-Jewish admixture is mainly Southern European, while Mizrahi Jews show evidence of admixture with other Middle Eastern populations. Behar et al. have remarked on a close relationship between Ashkenazi Jews and modern Italians. A 2001 study found that Jews were more closely related to groups of the Fertile Crescent (Kurds, Turks, and Armenians) than to their Arab neighbors, whose genetic signature was found in geographic patterns reflective of the Islamic conquests.
The studies also show that Sephardic Bnei Anusim (descendants of the "anusim" who were forced to convert to Catholicism), who comprise up to 19.8 percent of the population of today's Iberia (Spain and Portugal) and at least 10 percent of the population of Ibero-America (Hispanic America and Brazil), have Sephardic Jewish ancestry within the last few centuries. The Bene Israel and Cochin Jews of India, the Beta Israel of Ethiopia, and a portion of the Lemba people of Southern Africa, despite more closely resembling the local populations of their native countries, have also been thought to have some more remote ancient Jewish ancestry. Views on the Lemba have changed, and genetic Y-DNA analyses in the 2000s established a partially Middle Eastern origin for a portion of the male Lemba population but have been unable to narrow this down further. Although historically Jews have been found all over the world, in the decades since World War II and the establishment of Israel they have increasingly concentrated in a small number of countries. In 2021, Israel and the United States together accounted for over 85 percent of the global Jewish population, with approximately 45.3% and 39.6% of the world's Jews, respectively. More than half (51.2%) of world Jewry resides in just ten metropolitan areas. As of 2021, these ten areas were Tel Aviv, New York, Jerusalem, Haifa, Los Angeles, Miami, Philadelphia, Paris, Washington, and Chicago. The Tel Aviv metro area has the highest percentage of Jews among its total population (94.8%), followed by Haifa (73.1%), Jerusalem (72.3%), and Beersheba (60.4%), the balance being mostly Israeli Arabs. Outside Israel, the highest percentage of Jews in a metropolitan area was in New York (10.8%), followed by Miami (8.7%), Philadelphia (6.8%), San Francisco (5.1%), Washington (4.7%), Los Angeles (4.7%), Toronto (4.5%), and Baltimore (4.1%).
As of 2010, there were nearly 14 million Jews around the world, roughly 0.2% of the world's population at the time. According to the 2007 estimates of The Jewish People Policy Planning Institute, the world's Jewish population was 13.2 million. This statistic incorporates both practicing Jews affiliated with synagogues and the Jewish community, and approximately 4.5 million unaffiliated and secular Jews.[citation needed] According to Sergio Della Pergola, a demographer of the Jewish population, in 2021 there were about 6.8 million Jews in Israel, 6 million in the United States, and 2.3 million in the rest of the world. Israel, the Jewish nation-state, is the only country in which Jews make up a majority of the citizens. Israel was established as an independent democratic and Jewish state on 14 May 1948. As of 2016, 14 of the 120 members of its parliament, the Knesset, were Arab citizens of Israel (not including the Druze), most representing Arab political parties. One of Israel's Supreme Court judges is also an Arab citizen of Israel. Between 1948 and 1958, the Jewish population rose from 800,000 to two million. Currently, Jews account for 75.4 percent of the Israeli population, or 6 million people. The early years of the State of Israel were marked by the mass immigration of Holocaust survivors and of Jews fleeing Arab lands. Israel also has a large population of Ethiopian Jews, many of whom were airlifted to Israel in the late 1980s and early 1990s. Between 1974 and 1979, some 227,258 immigrants arrived in Israel, about half of them from the Soviet Union. This period also saw an increase in immigration to Israel from Western Europe, Latin America, and North America.
A trickle of immigrants from other communities has also arrived, including Indian Jews and others, as well as some descendants of Ashkenazi Holocaust survivors who had settled in countries such as the United States, Argentina, Australia, Chile, and South Africa. Some Jews have emigrated from Israel elsewhere because of economic problems or disillusionment with political conditions and the continuing Arab–Israeli conflict. Jewish Israeli emigrants are known as yordim. The waves of immigration to the United States and elsewhere at the turn of the 20th century, the founding of Zionism, and later events, including pogroms in Imperial Russia (mostly within the Pale of Settlement in present-day Ukraine, Moldova, Belarus, and eastern Poland), the massacre of European Jewry during the Holocaust, and the founding of the State of Israel, with the subsequent Jewish exodus from Arab lands, all resulted in substantial shifts in the population centers of world Jewry by the end of the 20th century. More than half of the world's Jews live in the Diaspora (see Population table). Currently, the largest Jewish community outside Israel, and either the largest or second-largest Jewish community in the world, is located in the United States, with 6 million to 7.5 million Jews by various estimates. Elsewhere in the Americas, there are also large Jewish populations in Canada (315,000), Argentina (180,000–300,000), and Brazil (196,000–600,000), and smaller populations in Mexico, Uruguay, Venezuela, Chile, Colombia, and several other countries (see History of the Jews in Latin America). According to a 2010 Pew Research Center study, about 470,000 people of Jewish heritage live in Latin America and the Caribbean.
Demographers disagree on whether the United States has a larger Jewish population than Israel, with many maintaining that Israel surpassed the United States in Jewish population during the 2000s, while others maintain that the United States still has the largest Jewish population in the world. Currently, a major national Jewish population survey is planned to ascertain whether or not Israel has overtaken the United States in Jewish population. Western Europe's largest Jewish community, and the third-largest Jewish community in the world, can be found in France, home to between 483,000 and 500,000 Jews, the majority of whom are immigrants or refugees from North African countries such as Algeria, Morocco, and Tunisia (or their descendants). The United Kingdom has a Jewish community of 292,000. In Eastern Europe, the exact figures are difficult to establish. The number of Jews in Russia varies widely according to whether a source uses census data (which requires a person to choose a single nationality among choices that include "Russian" and "Jewish") or eligibility for immigration to Israel (which requires that a person have one or more Jewish grandparents). According to the latter criteria, the heads of the Russian Jewish community assert that up to 1.5 million Russians are eligible for aliyah. In Germany, the 102,000 Jews registered with the Jewish community are a slowly declining population, despite the immigration of tens of thousands of Jews from the former Soviet Union since the fall of the Berlin Wall. Thousands of Israelis also live in Germany, either permanently or temporarily, for economic reasons. Prior to 1948, approximately 800,000 Jews were living in lands which now make up the Arab world (excluding Israel). Of these, just under two-thirds lived in the French-controlled Maghreb region, 15 to 20 percent in the Kingdom of Iraq, approximately 10 percent in the Kingdom of Egypt and approximately 7 percent in the Kingdom of Yemen. 
A further 200,000 lived in Pahlavi Iran and the Republic of Turkey. Today, around 26,000 Jews live in Muslim-majority countries, mainly in Turkey (14,200) and Iran (9,100), while Morocco (2,000), Tunisia (1,000), and the United Arab Emirates (500) host the largest communities in the Arab world. A small-scale exodus had begun in many countries in the early decades of the 20th century, although the only substantial aliyah came from Yemen and Syria. The exodus from Arab and Muslim countries took place primarily from 1948. The first large-scale exoduses took place in the late 1940s and early 1950s, primarily in Iraq, Yemen and Libya, with up to 90 percent of these communities leaving within a few years. The peak of the exodus from Egypt occurred in 1956. The exodus in the Maghreb countries peaked in the 1960s. Lebanon was the only Arab country to see a temporary increase in its Jewish population during this period, due to an influx of refugees from other Arab countries, although by the mid-1970s the Jewish community of Lebanon had also dwindled. In the aftermath of the exodus wave from Arab states, an additional migration of Iranian Jews peaked in the 1980s when around 80 percent of Iranian Jews left the country.[citation needed] Outside Europe, the Americas, the Middle East, and the rest of Asia, there are significant Jewish populations in Australia (112,500) and South Africa (70,000). There is also a 6,800-strong community in New Zealand. Since at least the time of the Ancient Greeks, a proportion of Jews have assimilated into the wider non-Jewish society around them, by either choice or force, ceasing to practice Judaism and losing their Jewish identity. Assimilation took place in all areas, and during all time periods, with some Jewish communities, for example the Kaifeng Jews of China, disappearing entirely. 
The advent of the Jewish Enlightenment of the 18th century (see Haskalah) and the subsequent emancipation of the Jewish populations of Europe and America in the 19th century accelerated this trend, encouraging Jews to increasingly participate in, and become part of, secular society. The result has been a growing rate of assimilation, as Jews marry non-Jewish spouses and stop participating in the Jewish community. Rates of interreligious marriage vary widely: in the United States, it is just under 50 percent; in the United Kingdom, around 53 percent; in France, around 30 percent; and in Australia and Mexico, as low as 10 percent. In the United States, only about a third of children from intermarriages affiliate with Jewish religious practice. The result is that most countries in the Diaspora have steady or slightly declining religiously Jewish populations as Jews continue to assimilate into the countries in which they live.[citation needed] The Jewish people and Judaism have experienced various persecutions throughout their history. During Late Antiquity and the Early Middle Ages, the Roman Empire (in its later phases known as the Byzantine Empire) repeatedly repressed the Jewish population, first by ejecting them from their homelands during the pagan Roman era and later by officially establishing them as second-class citizens during the Christian Roman era. According to James Carroll, "Jews accounted for 10% of the total population of the Roman Empire. By that ratio, if other factors had not intervened, there would be 200 million Jews in the world today, instead of something like 13 million." Later, in medieval Western Europe, further persecutions of Jews by Christians occurred, notably during the Crusades—when Jews all over Germany were massacred—and in a series of expulsions from the Kingdom of England, Germany, and France.
Then there occurred the largest expulsion of all, when Spain and Portugal, after the Reconquista (the Catholic Reconquest of the Iberian Peninsula), expelled both unbaptized Sephardic Jews and the ruling Muslim Moors. In the Papal States, which existed until 1870, Jews were required to live only in specified neighborhoods called ghettos. Islam and Judaism have a complex relationship. Traditionally Jews and Christians living in Muslim lands, known as dhimmis, were allowed to practice their religions and administer their internal affairs, but they were subject to certain conditions. They had to pay the jizya (a per capita tax imposed on free adult non-Muslim males) to the Islamic state. Dhimmis had an inferior status under Islamic rule. They had several social and legal disabilities such as prohibitions against bearing arms or giving testimony in courts in cases involving Muslims. Many of the disabilities were highly symbolic. The one described by Bernard Lewis as "most degrading" was the requirement of distinctive clothing, not found in the Quran or hadith but invented in early medieval Baghdad; its enforcement was highly erratic. On the other hand, Jews rarely faced martyrdom or exile, or forced compulsion to change their religion, and they were mostly free in their choice of residence and profession. Notable exceptions include the massacre of Jews and forcible conversion of some Jews by the rulers of the Almohad dynasty in Al-Andalus in the 12th century, as well as in Islamic Persia, and the forced confinement of Moroccan Jews to walled quarters known as mellahs beginning from the 15th century and especially in the early 19th century. 
In modern times, it has become commonplace for standard antisemitic themes to be conflated with anti-Zionism in the publications and pronouncements of Islamic movements such as Hezbollah and Hamas, in the pronouncements of various agencies of the Islamic Republic of Iran, and even in the newspapers and other publications of the Turkish Refah Partisi.[better source needed] Throughout history, many rulers, empires, and nations have oppressed their Jewish populations or sought to eliminate them entirely. Methods employed ranged from expulsion to outright genocide; within nations, often the threat of these extreme methods was sufficient to silence dissent. The history of antisemitism includes the First Crusade, which resulted in massacres of Jews; the Spanish Inquisition (led by Tomás de Torquemada) and the Portuguese Inquisition, with their persecution and autos-da-fé against the New Christians and Marrano Jews; the Bohdan Chmielnicki Cossack massacres in Ukraine; the pogroms backed by the Russian tsars; as well as expulsions from Spain, Portugal, England, France, Germany, and other countries in which the Jews had settled. According to a 2008 study published in the American Journal of Human Genetics, 19.8 percent of the modern Iberian population has Sephardic Jewish ancestry, indicating that the number of conversos may have been much higher than originally thought. The persecution reached a peak in Nazi Germany's Final Solution, which led to the Holocaust and the slaughter of approximately 6 million Jews. Of the world's 16 million Jews in 1939, almost 40% were murdered in the Holocaust. The Holocaust—the state-led systematic persecution and genocide of European Jews (and certain communities of North African Jews in European-controlled North Africa) and other minority groups of Europe during World War II by Germany and its collaborators—remains the most notable modern-day persecution of Jews. The persecution and genocide were accomplished in stages.
Legislation to remove the Jews from civil society was enacted years before the outbreak of World War II. Concentration camps were established in which inmates were used as slave labour until they died of exhaustion or disease. Where the Third Reich conquered new territory in Eastern Europe, specialized units called Einsatzgruppen murdered Jews and political opponents in mass shootings. Jews and Roma were crammed into ghettos before being transported hundreds of kilometres by freight train to extermination camps where, if they survived the journey, the majority of them were murdered in gas chambers. Virtually every arm of Germany's bureaucracy was involved in the logistics of the mass murder, turning the country into what one Holocaust scholar has called "a genocidal nation." Throughout Jewish history, Jews have repeatedly been directly or indirectly expelled from both their original homeland, the Land of Israel, and many of the areas in which they have settled. This experience as refugees has shaped Jewish identity and religious practice in many ways, and is thus a major element of Jewish history. In summary, the pogroms in Eastern Europe, the rise of modern antisemitism, the Holocaust, as well as the rise of Arab nationalism, all served to fuel the movements and migrations of huge segments of Jewry from land to land and continent to continent until they arrived back in large numbers at their original historical homeland in Israel. In the Bible, the patriarch Abraham is described as a migrant to the land of Canaan from Ur of the Chaldees. His descendants, the Children of Israel, undertook the Exodus (meaning "departure" or "exit" in Greek) from ancient Egypt, as described in the Book of Exodus. 
The first movement documented in the historical record occurred with the resettlement policy of the Neo-Assyrian Empire, which mandated the deportation of conquered peoples; it is estimated that some 4,500,000 people among its captive populations suffered this dislocation over three centuries of Assyrian rule. With regard to Israel, Tiglath-Pileser III claims he deported 80% of the population of Lower Galilee, some 13,520 people. Some 27,000 Israelites, 20 to 25% of the population of the Kingdom of Israel, were described as being deported by Sargon II; they were replaced by other deported populations and sent into permanent exile by Assyria, initially to the Upper Mesopotamian provinces of the Assyrian Empire. Between 10,000 and 80,000 people from the Kingdom of Judah were similarly exiled by Babylonia, but these people were then returned to Judea by Cyrus the Great of the Persian Achaemenid Empire. Many Jews were exiled again by the Roman Empire. The 2,000-year dispersion of the Jewish diaspora began under the Roman Empire, as Jews were spread throughout the Roman world and, driven from land to land, settled wherever they could live freely enough to practice their religion. Over the course of the diaspora, the center of Jewish life moved from Babylonia to the Iberian Peninsula to Poland to the United States and, as a result of Zionism, back to Israel. There were also many expulsions of Jews during the Middle Ages and the Enlightenment in Europe: in 1290, 16,000 Jews were expelled from England (see the Statute of Jewry); in 1396, 100,000 from France; and in 1421, thousands from Austria. Many of these Jews settled in East-Central Europe, especially Poland. Following the establishment of the Spanish Inquisition, in 1492 around 200,000 Sephardic Jews were expelled from Spain by the Spanish crown and the Catholic Church, followed by expulsions from Sicily in 1493 (37,000 Jews) and from Portugal in 1496.
The expelled Jews fled mainly to the Ottoman Empire, the Netherlands, and North Africa, others migrating to Southern Europe and the Middle East. During the 19th century, France's policies of equal citizenship regardless of religion led to the immigration of Jews (especially from Eastern and Central Europe). This contributed to the arrival of millions of Jews in the New World. Over two million Eastern European Jews arrived in the United States from 1880 to 1925. In the latest phase of migrations, the Islamic Revolution of Iran caused many Iranian Jews to flee Iran. Most found refuge in the US (particularly Los Angeles, California, and Long Island, New York) and Israel. Smaller communities of Persian Jews exist in Canada and Western Europe. Similarly, when the Soviet Union collapsed, many of the Jews in the affected territory (who had been refuseniks) were suddenly allowed to leave. This produced a wave of migration to Israel in the early 1990s. Israel is the only country with a Jewish population that is consistently growing through natural population growth, although the Jewish populations of other countries, in Europe and North America, have recently increased through immigration. In the Diaspora, in almost every country the Jewish population in general is either declining or steady, but Orthodox and Haredi Jewish communities, whose members often shun birth control for religious reasons, have experienced rapid population growth. Orthodox and Conservative Judaism discourage proselytism to non-Jews, but many Jewish groups have tried to reach out to the assimilated Jewish communities of the Diaspora in order for them to reconnect to their Jewish roots. Additionally, while in principle Reform Judaism favours seeking new members for the faith, this position has not translated into active proselytism, instead taking the form of an effort to reach out to non-Jewish spouses of intermarried couples. 
There is also a trend of Orthodox movements reaching out to secular Jews in order to give them a stronger Jewish identity, so that there is less chance of intermarriage. As a result of the efforts by these and other Jewish groups over the past 25 years, there has been a trend (known as the Baal teshuva movement) for secular Jews to become more religiously observant, though the demographic implications of the trend are unknown. Additionally, there is a growing rate of conversion by gentiles who choose to become Jews, known as Jews by Choice.

Contributions

Jewish individuals have played a significant role in the development and growth of Western culture, advancing many fields of thought, science, and technology, both historically and in modern times, including through discrete trends in Jewish philosophy, Jewish ethics, and Jewish literature, as well as specific trends in Jewish culture, including Jewish art, Jewish music, Jewish humor, Jewish theatre, Jewish cuisine, and Jewish medicine. Jews have established various Jewish political movements and religious movements and, through the authorship of the Hebrew Bible and parts of the New Testament, provided the foundation for Christianity and Islam. More than 20 percent of Nobel Prizes have been awarded to individuals of Jewish descent. Philanthropic giving is a core activity of many Jewish organizations.