[SOURCE: https://en.wikipedia.org/wiki/MMCD]
DVD (digital video disc or digital versatile disc) is a digital optical disc data storage format. It was invented and developed in 1995 and first released on November 1, 1996, in Japan. The medium can store any kind of digital data and has been widely used to store video programs (watched using DVD players), software and other computer files. DVDs offer significantly higher storage capacity than compact discs (CD) while having the same dimensions. A standard single-layer DVD can store up to 4.7 GB of data, a dual-layer DVD up to 8.5 GB. Dual-layer, double-sided DVDs can store up to a maximum of 17.08 GB. Prerecorded DVDs are mass-produced using molding machines that physically stamp data onto the DVD. Such discs are a form of DVD-ROM because data can only be read and not written or erased. Blank recordable DVD discs (DVD-R and DVD+R) can be recorded once using a DVD recorder and then function as a DVD-ROM. Rewritable DVDs (DVD-RW, DVD+RW, and DVD-RAM) can be recorded and erased many times. DVDs are used in the DVD-Video consumer digital video format and, less commonly, in the DVD-Audio consumer digital audio format, as well as for authoring DVD discs written in a special AVCHD format to hold high-definition material (often in conjunction with AVCHD format camcorders). DVDs containing other types of information may be referred to as DVD data discs. Etymology The Oxford English Dictionary comments that, "In 1995, rival manufacturers of the product initially named digital video disc agreed that, in order to emphasize the flexibility of the format for multimedia applications, the preferred abbreviation DVD would be understood to denote digital versatile disc." The OED also states that in 1995, "The companies said the official name of the format will simply be DVD. Toshiba had been using the name 'digital video disc', but that was switched to 'digital versatile disc' after computer companies complained that it left out their applications."
"Digital versatile disc" is the explanation provided in a DVD Forum Primer from 2000 and in the DVD Forum's mission statement, which is to promote broad acceptance of DVD products across the entertainment, consumer electronics, and information technology industries. Because DVDs became highly popular for the distribution of movies in the 2000s, the term DVD became popularly used in English as a noun to describe specifically a full-length movie released on the format; for example, the phrase "to watch a DVD" describes watching a movie on DVD. History Released in 1987, CD Video used analog video encoding on optical discs matching the established standard 120 mm (4.7 in) size of audio CDs. Video CD (VCD), released in 1993, became one of the first formats for distributing digitally encoded films on such discs. In the same year, two new optical disc storage formats were being developed. One was the Multimedia Compact Disc (MMCD), backed by Philips and Sony (developers of the CD and CD-i), and the other was the Super Density (SD) disc, supported by Toshiba, Time Warner, Matsushita Electric, Hitachi, Mitsubishi Electric, Pioneer, Thomson, and JVC. By the time of the press launches for both formats in January 1995, the MMCD nomenclature had been dropped, and Philips and Sony were referring to their format as Digital Video Disc (DVD). On May 3, 1995, an ad hoc industry technical group formed from five computer companies (IBM, Apple, Compaq, Hewlett-Packard, and Microsoft) issued a press release stating that they would only accept a single format. The group voted to boycott both formats unless the two camps agreed on a single, converged standard. They recruited Lou Gerstner, president of IBM, to pressure the executives of the warring factions.
In one significant compromise, the MMCD and SD groups agreed to adopt proposal SD 9, which specified that both layers of the dual-layered disc be read from the same side—instead of proposal SD 10, which would have created a two-sided disc that users would have to turn over. Philips/Sony strongly insisted on EFMPlus, the channel modulation code that Kees Schouhamer Immink had designed for the MMCD, because it made it possible to reuse the existing CD servo technology. Its drawback was a reduction in capacity from 5 to 4.7 GB. As a result, the DVD specification provided a storage capacity of 4.7 GB (4.38 GiB)[a] for a single-layered, single-sided disc and 8.5 GB (7.92 GiB) for a dual-layered, single-sided disc. The DVD specification ended up similar to Toshiba and Matsushita's Super Density Disc, except for the dual-layer option. MMCD was single-sided and optionally dual-layer, whereas SD was two half-thickness, single-layer discs which were pressed separately and then glued together to form a double-sided disc. Philips and Sony decided that it was in their best interests to end the format war, and on September 15, 1995, agreed to unify with companies backing the Super Density Disc to release a single format, with technologies from both. After other compromises between MMCD and SD, the group of computer companies won the day, and a single format was agreed upon. The computer companies also collaborated with the Optical Storage Technology Association (OSTA) on the use of their implementation of the ISO-13346 file system (known as Universal Disk Format) for use on the new DVDs. The format's details were finalized on December 8, 1995. In November 1995, Samsung announced it would start mass-producing DVDs by September 1996. The format launched on November 1, 1996, in Japan, mostly with music video releases. The first major releases from Warner Home Video arrived on December 20, 1996, with four titles being available.[b] The format's release in the U.S.
was delayed multiple times, from August 1996 to October 1996 to November 1996, before finally settling on early 1997. Players began to be produced domestically that winter, with March 24, 1997, as the U.S. launch date of the format proper in seven test markets.[c] Approximately 32 titles were available on launch day, mainly from the Warner Bros., MGM, and New Line libraries,[d] with the notable inclusion of the 1996 film Twister. However, the launch had been planned for the following day (March 25); retailers breaking the street date prompted a change in distribution arrangements between retailers and studios to prevent similar violations. The nationwide rollout for the format happened on August 22, 1997.[better source needed] DTS announced in late 1997 that it would support the format. The sound system company revealed details in a November 1997 online interview, and clarified that it would release discs in early 1998. However, this date was pushed back several times before DTS finally released its first titles at the 1999 Consumer Electronics Show. In 2001, blank DVD recordable discs cost the equivalent of US$27.34 in 2022 dollars. Movie and home entertainment distributors adopted the DVD format to replace the ubiquitous VHS tape as the primary consumer video distribution format. Immediately following the formal adoption of a unified standard for DVD, two of the four leading video game console companies (Sega and The 3DO Company) said they already had plans to design a gaming console with DVDs as the source medium. Sony stated at the time that it had no plans to use DVD in its gaming systems, despite being one of the developers of the DVD format and eventually the first company to actually release a DVD-based console. Game consoles such as the PlayStation 2, Xbox, and Xbox 360 use DVDs as their source medium for games and other software. Contemporary games for Windows were also distributed on DVD.
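The decimal-vs-binary capacity pairs quoted for DVD media (e.g. 4.7 GB listed elsewhere in the article as 4.38 GiB) follow from dividing by 2³⁰ rather than 10⁹; a minimal sketch of the conversion (the helper name is ours):

```python
# Nominal DVD capacities are quoted in decimal gigabytes
# (1 GB = 10**9 bytes); operating systems often report binary
# gibibytes (1 GiB = 2**30 bytes), which makes the same disc
# appear smaller.

def gb_to_gib(gb: float) -> float:
    """Convert decimal gigabytes to binary gibibytes."""
    return gb * 10**9 / 2**30

print(f"Single layer: 4.7 GB = {gb_to_gib(4.7):.2f} GiB")  # 4.38 GiB
print(f"Dual layer:   8.5 GB = {gb_to_gib(8.5):.2f} GiB")  # 7.92 GiB
```

The results match the 4.38 GiB and 7.92 GiB figures given in the specification discussion above.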
Early DVDs were mastered using DLT tape, but using DVD-R DL or +R DL eventually became common. TV DVD combos, combining a standard definition CRT TV or an HD flat panel TV with a DVD mechanism under the CRT or on the back of the flat panel, and VCR/DVD combos were also available for purchase. For consumers, DVD soon overtook VHS as the favored choice for home movie releases. In 2001, DVD players outsold VCRs for the first time in the United States. At that time, one in four American households owned a DVD player. By 2007, about 80% of Americans owned a DVD player, a figure that had surpassed VCR ownership; it was also higher than ownership of personal computers or cable television subscriptions. Specifications The DVD specifications created and updated by the DVD Forum are published as so-called DVD Books (e.g. DVD-ROM Book, DVD-Audio Book, DVD-Video Book, DVD-R Book, DVD-RW Book, DVD-RAM Book, DVD-AR (Audio Recording) Book, DVD-VR (Video Recording) Book, etc.). A DVD is made up of two 0.6 mm-thick discs glued together; normally one is blank, and the other contains the data. The gluing process must be done carefully to make the disc as flat as possible to avoid both birefringence and "disc tilt", which is when the disc is not perfectly flat, preventing it from being read. Some specifications for mechanical, physical and optical characteristics of DVD optical discs can be downloaded as freely available standards from the ISO website. There are also equivalent European Computer Manufacturers Association (Ecma) standards for some of these specifications, such as Ecma-267 for DVD-ROMs. Also, the DVD+RW Alliance publishes competing recordable DVD specifications such as DVD+R, DVD+R DL, DVD+RW or DVD+RW DL. These DVD formats are also ISO standards. Some DVD specifications (e.g. for DVD-Video) are not publicly available and can be obtained only from the DVD Format/Logo Licensing Corporation (DVD FLLC) for a fee of US$5000.
Every subscriber must sign a non-disclosure agreement as certain information in the DVD Books is proprietary and confidential. In January 2025, the DVD FLLC announced its own dissolution on January 31, 2025 (together with the DVD Forum itself, according to its charter) and the deposit of the DVD specifications at the National Diet Library of Japan in early 2025. As of August 2025, the specification documents are only available at the National Diet Library in Tokyo. Borrowing from the LaserDisc format, the DVD standard includes DVD-10 discs (Type B in ISO) with two recorded data layers such that only one layer is accessible from either side of the disc. This doubles the total nominal capacity of a DVD-10 disc to 9.4 GB (8.75 GiB), but each side is limited to 4.7 GB. Like DVD-5 discs, DVD-10 discs are defined as single-layer (SL) discs. DVD hardware accesses the additional layer (layer 1) by refocusing the laser through an otherwise normally placed, semitransparent first layer (layer 0). This laser refocus—and the subsequent time needed to reacquire laser tracking—can cause a noticeable pause in A/V playback on earlier DVD players, the length of which varies between models. A printed message explaining that the layer-transition pause was not a malfunction became standard on DVD keep cases. During mastering, a studio could make the transition less obvious by timing it to occur just before a camera angle change or other abrupt shift, an early example being the DVD release of Toy Story. Later in the format's life, larger data buffers and faster optical pickups in DVD players made layer transitions effectively invisible regardless of mastering.[citation needed] Dual-layer DVDs are recorded using Opposite Track Path (OTP). The DVD Book also permits an additional disc type called DVD-14: a hybrid double-sided disc with one dual-layer side, one single-layer side, and a total nominal capacity of 13.2 GB. DVD-14 has no counterpart in ISO.
Both of these additional disc types are extremely rare due to their complicated and expensive manufacturing. For this reason, some DVDs that were initially issued as double-sided discs were later pressed as two-disc sets. Note: The above sections regarding disc types pertain to 12 cm discs. The same disc types exist for 8 cm discs: ISO standards still regard these discs as Types A–D, while the DVD Book assigns them distinct disc types. DVD-14 has no analogous 8 cm type. The comparative data for 8 cm discs is provided further down. DVD recordable and rewritable HP initially developed recordable DVD media from the need to store data for backup and transport.[failed verification] DVD recordables are now also used for consumer audio and video recording. Three formats were developed: DVD-R/RW, DVD+R/RW (plus), and DVD-RAM. DVD-R is available in two formats, General (650 nm) and Authoring (635 nm), where Authoring discs may be recorded with CSS encrypted video content but General discs may not. Dual-layer recording (occasionally called double-layer recording) allows DVD-R and DVD+R discs to store nearly double the data of a single-layer disc—8.5 and 4.7 gigabyte capacities, respectively. The additional capacity comes at a cost: DVD±R DL discs have slower write speeds than DVD±R. DVD-R DL was developed for the DVD Forum by Pioneer Corporation; DVD+R DL was developed for the DVD+RW Alliance by Mitsubishi Kagaku Media (MKM) and Philips. Recordable DVD discs supporting dual-layer technology are backward-compatible with some hardware developed before the recordable medium. Capacity DVD drives and players DVD drives are devices that can read DVD discs on a computer. DVD players are a particular type of device that does not require a computer to work and can read DVD-Video and DVD-Audio discs. Read and write speeds for the first DVD drives and players were 1,385 kB/s (1,353 KiB/s); this speed is usually called "1×".
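The "N×" ratings are straight multiples of this base rate; a short sketch of the arithmetic (the 1,385 kB/s DVD and 153.6 kB/s CD base rates are from the text; the helper name is ours):

```python
# "Nx" DVD speed = N times the 1x base rate of 1,385 kB/s
# (kB = 1000 bytes). CD 1x is 153.6 kB/s (150 KiB/s).

DVD_1X_KBPS = 1385.0
CD_1X_KBPS = 153.6

def dvd_rate_kbps(n: float) -> float:
    """Data rate of an Nx DVD drive in kB/s."""
    return n * DVD_1X_KBPS

print(f"16x DVD: {dvd_rate_kbps(16):,.0f} kB/s")  # 22,160 kB/s
print(f"20x DVD: {dvd_rate_kbps(20):,.0f} kB/s")  # 27,700 kB/s
# DVD 1x is roughly nine times the CD 1x rate:
print(f"ratio: {DVD_1X_KBPS / CD_1X_KBPS:.2f}")   # 9.02
```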
More recent models, at 18× or 20×, have 18 or 20 times that speed. For CD drives, 1× means 153.6 kB/s (150 KiB/s), about one-ninth the DVD rate. DVDs can spin at much higher speeds than CDs – DVDs can spin at up to 32,000 RPM versus 23,000 for CDs. In practice, optical drives spin discs nowhere near these speeds, to provide a safety margin. DVD drives limit reading speed to 16× (constant angular velocity), which corresponds to 9,280 rotations per minute. Early-generation drives released before the mid-2000s have lower limits. DVD recordable and rewritable discs can be read and written using constant angular velocity (CAV), constant linear velocity (CLV), partial constant angular velocity (P-CAV), or zoned constant linear velocity (Z-CLV or ZCLV). Due to the slightly lower data density of dual-layer DVDs (4.25 GB instead of 4.7 GB per layer), the required rotation speed is around 10% faster for the same data rate, which means that the same angular speed rating equals a 10% higher physical rotation speed. For that reason, the reading speed of dual-layer media has stagnated at 12× (constant angular velocity) for half-height optical drives released since around 2005,[g] and slim-type optical drives can only record dual-layer media at 6× (constant angular velocity), while still supporting reading of dual-layer media at 8×. The quality and data integrity of optical media is measurable, which means that future data losses caused by deteriorating media can be predicted well in advance by measuring the rate of correctable data errors. The types of errors that can occur on a DVD are PIE (Parity Inner Error), 8PIE (Parity Inner Sum Eight Error), PIF (Parity Inner Failure), POE (Parity Outer Error), and POF (Parity Outer Failure), the last of which indicates data loss. Too many small errors within a small space create a POF condition.
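A disc-quality scan of the kind described above can be reduced to a pass/warn/fail verdict; a hypothetical sketch (the threshold values below are commonly cited limits from the DVD specification, not taken from this text, and should be checked against the specification or your drive vendor's documentation):

```python
# Hypothetical classifier for one disc-quality scan. Assumed limits:
# PI Sum-8 (PIE summed over 8 consecutive ECC blocks) <= 280 and
# PIF <= 4 per ECC block are the commonly cited specification maxima.

PI_SUM8_LIMIT = 280
PIF_LIMIT = 4

def classify_scan(pi_sum8_max: int, pif_max: int, pof_count: int) -> str:
    """Rough verdict from the worst values seen during a scan."""
    if pof_count > 0:
        # POF means data was uncorrectable: actual data loss.
        return "FAIL: uncorrectable errors (data loss)"
    if pi_sum8_max > PI_SUM8_LIMIT or pif_max > PIF_LIMIT:
        # Still readable, but deterioration is eroding the margins.
        return "WARN: correctable errors exceed assumed limits"
    return "OK: within assumed limits"

print(classify_scan(pi_sum8_max=120, pif_max=2, pof_count=0))
```

Tracking how the worst PIE/PIF values change between periodic scans is what lets deterioration be caught before any POF appears.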
The difference between POE and POF is that a POE is generated on a first failed read attempt, whereas a POF indicates an uncorrectable error after repeated attempts to read the data. Support for measuring disc quality varies among optical drive vendors and models. Unreadable data can be found with any drive using general-purpose tools such as badblocks. DVD-Video DVD-Video is a standard for distributing video/audio content on DVD media. The format went on sale in Japan on November 1, 1996, in the United States on March 24, 1997, to line up with the 69th Academy Awards that day; in Canada, Central America, and Indonesia later in 1997; and in Europe, Australia, and Africa in 1998. DVD-Video became the dominant form of home video distribution in Japan when it first went on sale on November 1, 1996, but it shared the market for home video distribution in the United States for several years; it was June 15, 2003, when weekly DVD-Video rentals in the United States began outnumbering weekly VHS cassette rentals. The purpose of CSS is twofold: to prevent byte-for-byte copies of an encrypted disc from being playable, and to require that player manufacturers license the technology, enabling the enforcement of playback restrictions such as region coding. Successors and decline In 2006, two new formats called HD DVD and Blu-ray Disc were released as successors to DVD. HD DVD competed unsuccessfully with Blu-ray Disc in the format war of 2006–2008. A dual-layer HD DVD can store up to 30 GB and a dual-layer Blu-ray disc can hold up to 50 GB. However, unlike previous format changes, e.g., vinyl to Compact Disc or VHS videotape to DVD, there was no immediate indication that production of the standard DVD would gradually wind down, as at the beginning of the 2010s DVDs still dominated, with around 75% of video sales and approximately one billion DVD player sales worldwide as of April 2011. In fact, experts claimed that the DVD would remain the dominant medium for at least another five years as Blu-ray technology was still in its introductory phase, write and read speeds being poor and necessary hardware being expensive and not readily available.
Consumers were also initially slow to adopt Blu-ray due to the cost. By 2009, 85% of stores were selling Blu-ray Discs. A high-definition television and appropriate connection cables are also required to take advantage of Blu-ray discs. Some analysts suggested that the biggest obstacle to replacing DVD was its installed base; a large majority of consumers were satisfied with DVDs. DVDs started to face competition from video on demand services around 2015 or 2016. With increasing numbers of homes having high-speed Internet connections, many people had the option to either rent or buy video from an online service and view it by streaming it directly from that service's servers, meaning they no longer needed any form of permanent storage media for video at all. By 2017, digital streaming services had overtaken the sales of DVDs and Blu-rays for the first time. Until the end of the 2010s, manufacturers continued to release standard DVD titles, and the format remained the preferred one for the release of older television programs and films. Shows that were shot and edited entirely on film, such as Star Trek: The Original Series, could not be released in high definition without being re-scanned from the original film recordings. Shows that were made between the early 1980s and the early 2000s were generally shot on film, then transferred to video tape, and then edited natively in either NTSC or PAL; this made high-definition transfers impossible, as these SD standards were baked into the final cuts of the episodes. Star Trek: The Next Generation was the only such show that had a Blu-ray release, as prints were re-scanned and edited from the ground up. By the beginning of the 2020s, sales of DVDs had dropped 86% with respect to the peak of DVD sales around 2005, while on-demand sales and, overall, subscription streaming of TV shows and movies grew by over 1,200%.
At its peak, DVD sales represented almost two thirds of the video market in the US; approximately 15 years later, around 2020, they had fallen to only 10% of the market. By 2022, there was increased demand for high-definition media, with the Ultra HD Blu-ray and regular Blu-ray formats making up almost half of the US market while sales of physical media continued to shrink in favor of streaming services. The decline continued further into the 2020s, with Best Buy and Target ending DVD sales in 2023, Netflix discontinuing its DVD-by-mail service the same year, and Redbox closing in 2024.[citation needed] Longevity Longevity of a storage medium is measured by how long the data remains readable, assuming compatible devices exist that can read it: that is, how long the disc can be stored until data is lost. Numerous factors affect longevity: composition and quality of the media (recording and substrate layers), humidity and light storage conditions, the quality of the initial recording (which is sometimes a matter of mutual compatibility of media and recorder), etc. According to NIST, "[a] temperature of 64.4 °F (18 °C) and 40% RH [Relative Humidity] would be considered suitable for long-term storage. A lower temperature and RH is recommended for extended-term storage." As with CDs, the stored information will begin to degrade over time, with most standard DVDs lasting up to 30 years depending on the environment in which they are stored and how much data they hold. According to the Optical Storage Technology Association (OSTA), "Manufacturers claim lifespans ranging from 30 to 100 years for DVD, DVD-R and DVD+R discs and up to 30 years for DVD-RW, DVD+RW and DVD-RAM." According to a NIST/LoC research project conducted in 2005–2007 using accelerated life testing, "There were fifteen DVD products tested, including five DVD-R, five DVD+R, two DVD-RW and three DVD+RW types. There were ninety samples tested for each product. ...
Overall, seven of the products tested had estimated life expectancies in ambient conditions of more than 45 years. Four products had estimated life expectancies of 30–45 years in ambient storage conditions. Two products had an estimated life expectancy of 15–30 years and two products had estimated life expectancies of less than 15 years when stored in ambient conditions." The life expectancies for 95% survival estimated in this project were broken down by type of product.[dubious – discuss]
========================================
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-Toonkel-2025-55]
OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but has since evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits for alleged copyright infringement brought by authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board.
Throughout 2024, roughly half of OpenAI's then-employed AI safety researchers left the company, citing its prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the actual capital collected significantly lagged the pledges. According to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but it later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly".
The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that would eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers. Brockman was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google. Nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with the profit being capped at 100 times any investment.
According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. 
On April 9, 2025, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the offer complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC) and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, getting equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan has been criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring is illegal and would remove governance safeguards from the nonprofit and the attorneys general.
The letter argues that OpenAI's complex structure was deliberately designed to remain accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever the amount of equity it could get in exchange. PBCs can choose how they balance their mission with profit-making. Controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation. The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman claimed was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, part of which would pay for OpenAI's use of Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added MS-Copilot to many installations of Windows, and released Microsoft Copilot mobile apps.
Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, which must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the next four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for military AI applications, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations.
In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion, up from $3.7 billion in 2024. The growth was driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025, up from 15.5 million at the end of 2024, alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models, and it projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This spending trajectory underscores both the enormous capital requirements of scaling cutting-edge AI and OpenAI's effort to maintain its leading position in the industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion.
The sale made OpenAI the world's most valuable privately held company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO after the board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstituted board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting member to observe the company's operations; Microsoft gave up its observer seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine if Altman's alleged lack of candor misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024.
Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded in 2024 by former Apple designer Jony Ive. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired the personal finance app Roi in October 2025. In the same month, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities. OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, these pieces of text usually contained detailed descriptions of various types of violence, including sexual violence.
A Time investigation uncovered that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, while Sama paid its annotators the equivalent of between $1.32 and $2.00 per hour after tax. Sama's spokesperson said that the $12.50 also covered other implicit costs, among which were infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the next five years. In the same month, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, the agreement has not been finalized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD, committing to purchase six gigawatts worth of AMD chips, starting with the MI450.
OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI, and signed a three-year licensing deal that will let users generate videos using Sora—OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI’s models could be integrated into Amazon’s digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned lieutenant colonel in the U.S. Army to join Detachment 201 as senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. 
GPT-3 is aimed at natural-language question answering, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, simply named "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for newer subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed "Strawberry".
Additionally, ChatGPT Pro, a $200/month subscription service offering unlimited o1 access and enhanced voice features, was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users. The feature was only available to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning. In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which the company said is better at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, with features for managing citations, formatting complex equations, and real-time collaborative editing.
In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this shift. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming to determine within four years how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although the team denied receiving anything close to that amount. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines such as Google due to an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in some conversations, including personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks.
CEO Sam Altman acknowledged the issue in a podcast, noting users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data. Management In 2018, Musk resigned from his Board of Directors seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners, and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media. Paul Nakasone then joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future" and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems on the other side should not be overly regulated. 
They also called for more technical safety research on superintelligence, and asked for more coordination, for example through governments launching a joint project that "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including through reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary investigative matters and are nonpublic, but the FTC's document was leaked. The investigation concerned allegations that the company scraped public data and published false and defamatory information. The FTC asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage.
Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal laws. According to Scott Kohler, OpenAI has opposed California's AI legislation, suggesting that the state bill encroaches on territory better regulated by the federal government. Public Citizen opposed a federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI never enforced it to cancel any employee's vested equity. However, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024, it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3, and which the Authors Guild believed to have contained over 100,000 copyrighted books.
In 2021, OpenAI developed a speech recognition tool called Whisper. OpenAI used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, Raw Story, and Alternate Media Inc. filed lawsuits against OpenAI for copyright infringement. The lawsuits were said to chart a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications were The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker. It was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform.
Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, during a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced. California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation. A text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied.
Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, of which four alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, who each died by suicide after prolonged ChatGPT use. In December 2025, Stein-Erik Soelberg, then 56 years old, allegedly murdered his mother, Suzanne Adams. In the months prior, Soelberg, who was reportedly experiencing paranoid delusions, had often discussed his beliefs with ChatGPT. Adams's estate then sued OpenAI, claiming that the company shared responsibility due to the risk of so-called chatbot psychosis, a term that is not a recognized medical diagnosis. OpenAI responded that it would make ChatGPT safer for users disconnected from reality.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Lod#cite_note-57] | [TOKENS: 4733] |
Lod Lod (Hebrew: לוד, fully vocalized: לֹד), also known as Lydda (Ancient Greek: Λύδδα) and Lidd (Arabic: اللِّدّ, romanized: al-Lidd, or اللُّدّ, al-Ludd), is a city 15 km (9 1⁄2 mi) southeast of Tel Aviv and 40 km (25 mi) northwest of Jerusalem in the Central District of Israel. It is situated between the lower Shephelah on the east and the coastal plain on the west. The city had a population of 90,814 in 2023. Lod has been inhabited since at least the Neolithic period. It is mentioned a few times in the Hebrew Bible and in the New Testament. Between the 5th century BCE and the late Roman period, it was a prominent center for Jewish scholarship and trade. Around 200 CE, the city became a Roman colony and was renamed Diospolis (Ancient Greek: Διόσπολις, lit. 'city of Zeus'). Tradition identifies Lod as the 4th-century martyrdom site of Saint George; the Church of Saint George and Mosque of Al-Khadr, located in the city, is believed to have housed his remains. Following the Arab conquest of the Levant, Lod served as the capital of Jund Filastin; however, a few decades later, the seat of power was transferred to Ramla, and Lod declined in importance. Under Crusader rule, the city was a Catholic diocese of the Latin Church, and it remains a titular see to this day. Lod underwent a major change in its population in the mid-20th century. Exclusively Palestinian Arab in 1947, Lod was part of the area designated for an Arab state in the United Nations Partition Plan for Palestine; however, in July 1948, the city was occupied by the Israel Defense Forces, and most of its Arab inhabitants were expelled in the Palestinian expulsion from Lydda and Ramle. The city was largely resettled by Jewish immigrants, most of them expelled from Arab countries. Today, Lod is one of Israel's mixed cities, with an Arab population of 30%. Lod is one of Israel's major transportation hubs.
The main international airport, Ben Gurion Airport, is located 8 km (5 miles) north of the city. The city is also a major railway and road junction. Religious references The Hebrew name Lod appears in the Hebrew Bible as a town of Benjamin, founded along with Ono by Shamed or Shamer (1 Chronicles 8:12; Ezra 2:33; Nehemiah 7:37; 11:35). In Ezra 2:33, it is mentioned as one of the cities whose inhabitants returned after the Babylonian captivity. Lod is not mentioned among the towns allocated to the tribe of Benjamin in Joshua 18:11–28. The name Lod derives from a tri-consonantal root not extant in Northwest Semitic, but only in Arabic (“to quarrel; withhold, hinder”). An Arabic etymology of such an ancient name is unlikely (the earliest attestation is from the Achaemenid period). In the New Testament, the town appears in its Greek form, Lydda, as the site of Peter's healing of Aeneas in Acts 9:32–38. The city is also mentioned in an Islamic hadith as the location of the battlefield where the false messiah (al-Masih ad-Dajjal) will be slain before the Day of Judgment. History The first occupation dates to the Neolithic in the Near East and is associated with the Lodian culture. Occupation continued into the Chalcolithic period in the Levant. Pottery finds have dated the initial settlement in the area now occupied by the town to 5600–5250 BCE. In the Early Bronze Age, it was an important settlement in the central coastal plain between the Judean Shephelah and the Mediterranean coast, along Nahal Ayalon. Other important nearby sites were Tel Dalit, Tel Bareqet, Khirbat Abu Hamid (Shoham North), Tel Afeq, Azor and Jaffa. Two architectural phases belong to the late EB I in Area B. The first phase had a mudbrick wall, while the late phase included a circular stone structure. Later excavations have revealed a further occupation layer, Stratum IV. It consists of two phases, Stratum IVb having a mudbrick wall on stone foundations with rounded exterior corners. 
In Stratum IVa there was a mudbrick wall with no stone foundations, with imported Egyptian pottery and local imitations. Another excavation revealed nine occupation strata. Strata VI–III belonged to Early Bronze IB. The material culture showed Egyptian imports in strata V and IV. Occupation continued into Early Bronze II with four strata (V–II). There was continuity in the material culture and indications of centralized urban planning. North of the tell were scattered MB II burials. The earliest written record is in a list of Canaanite towns drawn up by the Egyptian pharaoh Thutmose III at Karnak in 1465 BCE. From the fifth century BCE until the Roman period, the city was a centre of Jewish scholarship and commerce. According to British historian Martin Gilbert, during the Hasmonean period, Jonathan Maccabee and his brother, Simon Maccabaeus, enlarged the area under Jewish control, which included conquering the city. The Jewish community in Lod during the Mishnah and Talmud era is described in a significant number of sources, including information on its institutions, demographics, and way of life. The city reached its height as a Jewish center between the First Jewish–Roman War and the Bar Kokhba revolt, and again in the days of Judah ha-Nasi and the start of the Amoraim period. The city was then the site of numerous public institutions, including schools, study houses, and synagogues. In 43 BCE, Cassius, the Roman governor of Syria, sold the inhabitants of Lod into slavery, but they were set free two years later by Mark Antony. During the First Jewish–Roman War, the Roman proconsul of Syria, Cestius Gallus, razed the town on his way to Jerusalem in Tishrei 66 CE. According to Josephus, "[he] found the city deserted, for the entire population had gone up to Jerusalem for the Feast of Tabernacles. He killed fifty people whom he found, burned the town and marched on". Lydda was occupied by Emperor Vespasian in 68 CE. 
In the period following the destruction of Jerusalem in 70 CE, Rabbi Tarfon, who appears in many Tannaitic and Jewish legal discussions, served as a rabbinic authority in Lod. During the Kitos War, 115–117 CE, the Roman army laid siege to Lod, where the rebel Jews had gathered under the leadership of Julian and Pappos. Torah study was outlawed by the Romans and pursued mostly underground. The distress became so great that the patriarch Rabban Gamaliel II, who was shut up there and died soon afterwards, permitted fasting on Ḥanukkah. Other rabbis disagreed with this ruling. Lydda was next taken, and many of the Jews were executed; the "slain of Lydda" are often mentioned in words of reverential praise in the Talmud. In 200 CE, emperor Septimius Severus elevated the town to the status of a city, calling it Colonia Lucia Septimia Severa Diospolis. The name Diospolis ("City of Zeus") may have been bestowed earlier, possibly by Hadrian. At that point, most of its inhabitants were Christian. The earliest known bishop is Aëtius, a friend of Arius. During the following century, Joshua ben Levi is said to have founded a yeshiva in Lod. In December 415, the Council of Diospolis was held here to try Pelagius; he was acquitted. In the sixth century, the city was renamed Georgiopolis after St. George, a soldier in the guard of the emperor Diocletian, who was born there between 256 and 285 CE. The Church of Saint George and Mosque of Al-Khadr is named for him. The 6th-century Madaba map shows Lydda as an unwalled city with a cluster of buildings under a black inscription reading "Lod, also Lydea, also Diospolis". An isolated large building with a semicircular colonnaded plaza in front of it might represent the St George shrine. 
After the Muslim conquest of Palestine by Amr ibn al-'As in 636 CE, Lod, referred to in Arabic as "al-Ludd", served as the capital of Jund Filastin ("Military District of Palaestina") before the seat of power was moved to nearby Ramla during the reign of the Umayyad Caliph Suleiman ibn Abd al-Malik in 715–716. The population of al-Ludd was relocated to Ramla as well. With the relocation of its inhabitants and the construction of the White Mosque in Ramla, al-Ludd lost its importance and fell into decay. The city was visited by the local Arab geographer al-Muqaddasi in 985, when it was under the Fatimid Caliphate, and was noted for its Great Mosque, which served the residents of al-Ludd, Ramla, and the nearby villages. He also wrote of the city's "wonderful church (of St. George) at the gate of which Christ will slay the Antichrist." The Crusaders occupied the city in 1099 and named it St Jorge de Lidde. It was briefly conquered by Saladin, but retaken by the Crusaders in 1191. For the English Crusaders, it was a place of great significance as the birthplace of Saint George. The Crusaders made it the seat of a Latin Church diocese, and it remains a titular see. It owed the service of 10 knights and 20 sergeants, and it had its own burgess court during this era. In 1226, the Ayyubid-era Syrian geographer Yaqut al-Hamawi visited al-Ludd and stated it was part of the Jerusalem District under Ayyubid rule. Sultan Baybars brought Lydda again under Muslim control by 1267–8. According to Qalqashandi, Lydda was an administrative centre of a wilaya in the Mamluk empire during the fourteenth and fifteenth centuries. Mujir al-Din described it as a pleasant village with an active Friday mosque. During this time, Lydda was a station on the postal route between Cairo and Damascus. 
In 1517, Lydda was incorporated into the Ottoman Empire as part of the Damascus Eyalet, and in the 1550s, the revenues of Lydda were designated for the new waqf of Hasseki Sultan Imaret in Jerusalem, established by Hasseki Hurrem Sultan (Roxelana), the wife of Suleiman the Magnificent. By 1596, Lydda was part of the nahiya ("subdistrict") of Ramla, which was under the administration of the liwa ("district") of Gaza. It had a population of 241 households and 14 bachelors who were all Muslims, and 233 households who were Christians. They paid a fixed tax rate of 33.3% on agricultural products, including wheat, barley, summer crops, vineyards, fruit trees, sesame, special products ("dawalib", i.e. spinning wheels), goats and beehives, in addition to occasional revenues and market toll, a total of 45,000 akçe. All of the revenue went to the waqf. In 1051 AH (1641/2 CE), the Bedouin tribe of al-Sawālima from around Jaffa attacked the villages of Subṭāra, Bayt Dajan, al-Sāfiriya, Jindās, Lydda and Yāzūr belonging to the Waqf Haseki Sultan. The village appeared as Lydda, though misplaced, on the map of Pierre Jacotin compiled in 1799. Missionary William M. Thomson visited Lydda in the mid-19th century, describing it as a "flourishing village of some 2,000 inhabitants, imbosomed in noble orchards of olive, fig, pomegranate, mulberry, sycamore, and other trees, surrounded every way by a very fertile neighbourhood. The inhabitants are evidently industrious and thriving, and the whole country between this and Ramleh is fast being filled up with their flourishing orchards. Rarely have I beheld a rural scene more delightful than this presented in early harvest ... It must be seen, heard, and enjoyed to be appreciated." In 1869, the population of Ludd was given as: 55 Catholics, 1,940 "Greeks", 5 Protestants and 4,850 Muslims. In 1870, the Church of Saint George was rebuilt. In 1892, the first railway station in the entire region was established in the city. 
In the second half of the 19th century, Jewish merchants migrated to the city, but left after the 1921 Jaffa riots. In 1882, the Palestine Exploration Fund's Survey of Western Palestine described Lod as "A small town, standing among enclosure of prickly pear, and having fine olive groves around it, especially to the south. The minaret of the mosque is a very conspicuous object over the whole of the plain. The inhabitants are principally Moslim, though the place is the seat of a Greek bishop resident of Jerusalem. The Crusading church has lately been restored, and is used by the Greeks. Wells are found in the gardens...." From 1918, Lydda was under the administration of the British Mandate in Palestine, as per a League of Nations decree that followed the First World War. During the Second World War, the British set up supply posts in and around Lydda and its railway station, also building an airport that was renamed Ben Gurion Airport after the death of Israel's first prime minister in 1973. At the time of the 1922 census of Palestine, Lydda had a population of 8,103 inhabitants (7,166 Muslims, 926 Christians, and 11 Jews); the Christians comprised 921 Orthodox, 4 Roman Catholics and 1 Melkite. This had increased by the 1931 census to 11,250 (10,002 Muslims, 1,210 Christians, 28 Jews, and 10 Bahai), in a total of 2,475 residential houses. In 1938, Lydda had a population of 12,750. In 1945, Lydda had a population of 16,780 (14,910 Muslims, 1,840 Christians, 20 Jews and 10 "other"). Until 1948, Lydda was an Arab town with a population of around 20,000: 18,500 Muslims and 1,500 Christians. In 1947, the United Nations proposed dividing Mandatory Palestine into two states, one Jewish and one Arab; Lydda was to form part of the proposed Arab state. In the ensuing war, Israel captured Arab towns outside the area the UN had allotted it, including Lydda. 
In December 1947, thirteen Jewish passengers in a seven-car convoy to Ben Shemen Youth Village were ambushed and murdered. In a separate incident, three Jewish youths, two men and a woman, were captured, then raped and murdered in a neighbouring village. Their bodies were paraded in Lydda's principal street. The Israel Defense Forces entered Lydda on 11 July 1948. The following day, under the impression that it was under attack, the 3rd Battalion was ordered to shoot anyone "seen on the streets". According to Israel, 250 Arabs were killed. Other estimates are higher: Arab historian Aref al Aref estimated 400, and Nimr al Khatib 1,700. In 1948, the population rose to 50,000 during the Nakba, as Arab refugees fleeing other areas made their way there. A key event was the Palestinian expulsion from Lydda and Ramle, in which 50,000–70,000 Palestinians were expelled from the two towns by the Israel Defense Forces. All but 700 to 1,056 were expelled by order of the Israeli high command, and forced to walk 17 km (10+1⁄2 mi) to the Jordanian Arab Legion lines. Estimates of those who died from exhaustion and dehydration vary from a handful to 355. The town was subsequently sacked by the Israeli army. Some scholars, including Ilan Pappé, characterize this as ethnic cleansing. The few hundred Arabs who remained in the city were soon outnumbered by the influx of Jews who immigrated to Lod from August 1948 onward, most of them from Arab countries. As a result, Lod became a predominantly Jewish town. After the establishment of the state, the biblical name Lod was readopted. The Jewish immigrants who settled Lod came in waves, first from Morocco and Tunisia, later from Ethiopia, and then from the former Soviet Union. Since 2008, many urban development projects have been undertaken to improve the image of the city. Upscale neighbourhoods have been built, among them Ganei Ya'ar and Ahisemah, expanding the city to the east. 
According to a 2010 report in The Economist, a three-meter-high wall was built between Jewish and Arab neighbourhoods, and construction in Jewish areas was given priority over construction in Arab neighbourhoods. The newspaper says that violent crime in the Arab sector revolves mainly around family feuds over turf and honour crimes. In 2010, the Lod Community Foundation organised an event for representatives of bicultural youth movements, volunteer aid organisations, educational start-ups, businessmen, sports organizations, and conservationists working on programmes to better the city. In the 2021 Israel–Palestine crisis, a state of emergency was declared in Lod after Arab rioting led to the death of an Israeli Jew. The Mayor of Lod, Yair Revivio, urged Prime Minister of Israel Benjamin Netanyahu to deploy the Israel Border Police to restore order in the city. This was the first time since 1966 that Israel had declared this kind of emergency lockdown. International media noted that both Jewish and Palestinian mobs were active in Lod, but the "crackdown came for one side" only. Demographics In the 19th century and until the Lydda Death March, Lod was an exclusively Muslim-Christian town, with an estimated 6,850 inhabitants, of whom approximately 2,000 (29%) were Christian. According to the Israel Central Bureau of Statistics (CBS), the population of Lod in 2010 was 69,500. According to the 2019 census, the population of Lod was 77,223, of which 53,581 people (69.4% of the city's population) were classified as "Jews and Others" and 23,642 people (30.6%) as "Arab". Education According to CBS, the city has 38 schools and 13,188 pupils: 26 elementary schools with 8,325 pupils, and 13 high schools with 4,863 pupils. 
About 52.5% of 12th-grade pupils were entitled to a matriculation certificate in 2001. Economy The airport and related industries are a major source of employment for the residents of Lod. Other important factories in the city are the communication equipment company "Talard"; "Cafe-Co", a subsidiary of the Strauss Group; and "Kashev", the computer center of Bank Leumi. A Jewish Agency Absorption Centre is also located in Lod. According to CBS figures for 2000, 23,032 people were salaried workers and 1,405 were self-employed. The mean monthly wage for a salaried worker was NIS 4,754, a real change of 2.9% over the course of 2000. Salaried men had a mean monthly wage of NIS 5,821 (a real change of 1.4%) versus NIS 3,547 for women (a real change of 4.6%). The mean income for the self-employed was NIS 4,991. About 1,275 people were receiving unemployment benefits and 7,145 were receiving an income supplement. Art and culture In 2009–2010, Dor Guez held an exhibit, Georgeopolis, at the Petach Tikva art museum focusing on Lod. Archaeology A well-preserved mosaic floor dating to the Roman period was excavated in 1996 as part of a salvage dig conducted on behalf of the Israel Antiquities Authority and the Municipality of Lod, prior to the widening of HeHalutz Street. According to Jacob Fisch, executive director of the Friends of the Israel Antiquities Authority, a worker at the construction site noticed the tail of a tiger and halted work. The mosaic was initially covered over with soil at the conclusion of the excavation for lack of funds to conserve and develop the site. The mosaic is now part of the Lod Mosaic Archaeological Center. The floor, with its colorful display of birds, fish, exotic animals and merchant ships, is believed to have been commissioned by a wealthy resident of the city for his private home. 
The Lod Community Archaeology Program, which operates in ten Lod schools, five Jewish and five Israeli Arab, combines archaeological studies with participation in digs in Lod. Sports The city's major football club, Hapoel Bnei Lod, plays in Liga Leumit (the second division). Its home ground is the Lod Municipal Stadium. The club was formed by a merger of Bnei Lod and Rakevet Lod in the 1980s. Two other clubs in the city play in the regional leagues: Hapoel MS Ortodoxim Lod in Liga Bet and Maccabi Lod in Liga Gimel. Hapoel Lod played in the top division during the 1960s and 1980s, and won the State Cup in 1984. The club folded in 2002. A new club, Hapoel Maxim Lod (named after former mayor Maxim Levy), was established soon after, but folded in 2007.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-pmid19834483-35]
Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan. The vast majority of bilaterians belong to two large clades: the protostomes, which include arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the last of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. 
Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, the arts, literature, heraldry, politics, and sports. Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. 
In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετά meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals share several characteristics with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot do so, a feature they share with fungi; animals ingest organic material and digest it internally. Animals also have structural characteristics that set them apart from all other living things: typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. 
In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated mating with close relatives during sexual reproduction generally leads to inbreeding depression within a population due to the increased prevalence of harmful recessive traits, and animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites. Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction where the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. The selective pressures they impose on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic and competitive coevolutions. Almost all multicellular predators are animals. 
Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels acquire the nutrients indirectly by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow, sustain basal metabolism, and fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move onto land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments include Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera and Nematoda. 
Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres. Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. The following table lists estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine) and free-living or parasitic ways of life. Species estimates shown here are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. 
Using patterns within the taxonomic hierarchy, the total number of animal species, including those not yet described, was calculated to be about 7.77 million in 2011. Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges, based on molecular clock estimates for the origin of 24-ipc production in both groups: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia established their animal nature. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may, however, be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. 
That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear for example in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms. However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing the external phylogeny shown in the cladogram. Uncertainty of relationships is indicated with dashed lines. 
The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. In that cladogram, the successive outgroups to the animals are the Holomycota (including the fungi), the Ichthyosporea, the Pluriformea, and the Filasterea. The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. Like the sponges, the Placozoa have no symmetry and were often considered a "missing link" between protists and multicellular animals. The presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, with the following cladogram for the sponge-sister view that they supported (their ctenophore-sister tree simply interchanging the places of ctenophores and sponges): (Porifera, (Ctenophora, (Placozoa, (Cnidaria, Bilateria)))). Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to construct the following ctenophore-sister phylogeny: (Ctenophora, (Porifera, (Placozoa, (Cnidaria, Bilateria)))). Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. 
They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly resolved and under active research. The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, whose members have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. A modern consensus phylogeny for the Bilateria places the Xenacoelomorpha as sister to all the rest, with the remaining phyla divided between the deuterostomes (Ambulacraria and Chordata) and the protostomes (Ecdysozoa and Spiralia). Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. 
However, over evolutionary time, descendant species have evolved which have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes. 
The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs. History of classification In the classical era, Aristotle divided animals, based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess') and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. 
In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes or radiates (echinoderms, cnidarians and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia. In human culture The human population exploits a large number of other animal species for food, both of domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world. Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. 
Working animals including cattle and horses have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccines were first developed in the 18th century. Some medicines such as the cancer drug trabectedin are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets: invertebrates such as tarantulas, octopuses, and praying mantises; reptiles such as snakes and chameleons; and birds including canaries, parakeets, and parrots. However, the most commonly kept pets are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans, and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also the symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros, and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. 
Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Aftermath_of_World_War_II]
Contents Aftermath of World War II The aftermath of World War II saw the rise of two global superpowers, the United States and the Soviet Union. The aftermath of World War II was also defined by the rising threat of nuclear warfare, the creation and implementation of the United Nations as an intergovernmental organization, and the decolonization of Asia, Oceania, South America and Africa by European and East Asian powers, most notably by the United Kingdom, France, and Japan. Once allies during World War II, the U.S. and the Soviet Union became competitors on the world stage and engaged in the Cold War, so called because it never resulted in overt, declared total war between the two powers. It was instead characterized by espionage, political subversion and proxy wars. Western Europe was rebuilt through the American Marshall Plan, whereas Central and Eastern Europe fell under the Soviet sphere of influence and eventually behind an "Iron Curtain". Europe was divided into a U.S.-led Western Bloc and a Soviet-led Eastern Bloc. Internationally, alliances with the two blocs gradually shifted, with some nations trying to stay out of the Cold War through the Non-Aligned Movement. The Cold War also saw a nuclear arms race between the two superpowers, and part of the reason that the Cold War never became a "hot" war was that the Soviet Union and the United States had nuclear deterrents against each other, leading to a mutually assured destruction standoff. As a consequence of the war, the Allies created the United Nations, an organization for international cooperation and diplomacy, similar to the League of Nations. Members of the United Nations agreed to outlaw wars of aggression in an attempt to avoid a third world war. The devastated great powers of Western Europe formed the European Coal and Steel Community, which later evolved into the European Economic Community and ultimately into the current European Union. 
This effort primarily began as an attempt to avoid another war between West Germany and France through economic cooperation and integration, and a common market for important natural resources. The end of the war opened the way for decolonization, as independence was granted to India and Pakistan (from the United Kingdom), Vietnam, Laos, Cambodia, French India and Vanuatu (from France), Indonesia (from the Netherlands), the Philippines (from the U.S.), and several Arab nations from specific Mandates granted to European states by the defunct League of Nations. The State of Israel was also established following the disestablishment of the British-ruled Mandatory Palestine and the 1948 Palestine War. Nations in Sub-Saharan Africa achieved independence in the 1950s to 1970s. The aftermath of World War II saw the rise of communist influence in East and Southeast Asia. The People's Republic of China was founded after the Chinese Communist Party (CCP) emerged victorious from the Chinese Civil War in 1949, and the First Indochina War was fought between the Viet Minh government and France after the Japanese retreat. The Korean War led to the division of the Korean Peninsula between the communist North and the Western-aligned South. Immediate effects of World War II At the end of the war in Europe, tens of millions of people had been killed and even more were displaced, European economies had collapsed, and much of Europe's industrial infrastructure had been destroyed. In response, in 1947 U.S. Secretary of State George Marshall devised the "European Recovery Program", which became known as the Marshall Plan. Under the plan, from 1948 to 1952 the United States government allocated US$13 billion (US$140 billion in 2024 dollars) for the reconstruction of affected countries in Western Europe. By the end of the war, the economy of the United Kingdom was one of severe privation, as a significant portion of its national wealth had been consumed by the war effort. 
Until the introduction in 1941 of Lend-Lease aid from the US, the UK had been spending its assets to purchase American equipment including aircraft and ships—over £437 million (equivalent to some £27 billion in 2023) on aircraft alone. Lend-Lease came just before its reserves were exhausted. Britain had placed 55% of its total labour force into war production. In the spring of 1945, after the final defeat of Germany, the Labour Party withdrew from the wartime coalition government in an effort to oust Winston Churchill, forcing a general election. Following a landslide victory, Labour held more than 60% of the seats in the House of Commons and formed a new government on 26 July 1945 under Clement Attlee, who had been Deputy Prime Minister in the coalition government. Britain's war debt was described by some in the American administration as a "millstone round the neck of the British economy". Although there were suggestions for an international conference to tackle the issue, in August 1945 the U.S. announced unexpectedly that the Lend-Lease programme was to end immediately. The abrupt withdrawal of American Lend-Lease support to Britain on 2 September 1945 dealt a severe blow to the plans of the new government. It was only with the completion of the Anglo-American loan by the United States to Great Britain on 15 July 1946 that some measure of economic stability was restored. However, the loan was made primarily to support British overseas expenditure in the immediate post-war years and not to implement the Labour government's policies for domestic welfare reforms and the nationalisation of key industries. Although the loan was agreed on reasonable terms, its conditions included what proved to be damaging fiscal conditions for sterling. From 1946 to 1948, the UK introduced bread rationing, which it had never done during the war. The Soviet Union suffered enormous losses in the war against Germany. 
The Soviet population decreased by about 27 million during the war; of these, 8.7 million were combat deaths. The 19 million non-combat deaths had a variety of causes: starvation in the siege of Leningrad; conditions in German prisons and concentration camps; mass shootings of civilians; harsh labour in German industry; famine and disease; conditions in Soviet camps; and service in German or German-controlled military units fighting the Soviet Union. Soviet ex-POWs and civilians repatriated from abroad were suspected of having been Nazi collaborators, and 226,127 of them were sent to forced labour camps after scrutiny by the Soviet security agency, the NKVD. Many ex-POWs and young civilians were also conscripted to serve in the Red Army. Others worked in labour battalions to rebuild infrastructure destroyed during the war. The economy had been devastated. Roughly a quarter of the Soviet Union's capital resources were destroyed, and industrial and agricultural output in 1945 fell far short of pre-war levels. To help rebuild the country, the Soviet government obtained limited credits from Britain and Sweden; it refused assistance offered by the United States under the Marshall Plan. Instead, the Soviet Union coerced Soviet-occupied Central and Eastern Europe to supply machinery and raw materials. Germany and former Nazi satellites made reparations to the Soviet Union. The reconstruction programme emphasized heavy industry to the detriment of agriculture and consumer goods. By 1953, steel production was twice its 1940 level, but the production of many consumer goods and foodstuffs was lower than it had been in the late 1920s. The immediate post-war period in Europe was dominated by the Soviet Union annexing, or converting into Soviet Socialist Republics, all the countries that the Red Army had invaded and annexed as it drove the Germans out of central and eastern Europe. 
New satellite states were set up by the Soviets in Poland, Bulgaria, Hungary, Czechoslovakia, Romania, Albania, and East Germany; the last of these was created from the Soviet zone of occupation in Germany. Yugoslavia emerged as an independent Communist state allied but not aligned with the Soviet Union, owing to the independent nature of the military victory of the Partisans of Josip Broz Tito during World War II in Yugoslavia. The Allies established the Far Eastern Commission and Allied Council for Japan to administer their occupation of that country, while the Allied Control Council administered occupied Germany. Following the Potsdam Conference agreements, the Soviet Union occupied and subsequently annexed the strategic island of Sakhalin. In the east, the Sudetenland reverted to Czechoslovakia following the European Advisory Commission's decision to delimit German territory to be the territory it held on 31 December 1937. Close to one-quarter of pre-war (1937) Nazi Germany was de facto annexed by the Allies; roughly 10 million Germans were either expelled from this territory or not permitted to return to it if they had fled during the war. The remainder of Germany was partitioned into four zones of occupation, coordinated by the Allied Control Council. The Saar was detached and put into economic union with France in 1947. In 1949, the Federal Republic of Germany was created out of the Western zones. The Soviet zone became the German Democratic Republic. Germany paid reparations to the United Kingdom, France, and the Soviet Union, mainly in the form of dismantled factories, forced labour, and coal. The German standard of living was to be reduced to its 1932 level. Beginning immediately after the German surrender and continuing for the next two years, the U.S. and Britain pursued an "intellectual reparations" programme to harvest all technological and scientific know-how as well as all patents in Germany. 
The value of these amounted to around US$10 billion (US$120 billion in 2024 dollars). In accordance with the Paris Peace Treaties of 1947, reparations were also assessed from Italy, Romania, Hungary, Bulgaria, and Finland. US policy in post-war Germany from April 1945 until July 1947 had been that no help should be given to the Germans in rebuilding their nation, save for the minimum required to mitigate starvation. The Allies' immediate post-war "industrial disarmament" plan for Germany had been to destroy Germany's capability to wage war by complete or partial de-industrialization. The first industrial plan for Germany, signed in 1946, required the destruction of 1,500 manufacturing plants to lower German heavy industry output to roughly 50% of its 1938 level. The dismantling of West German industry ended in 1951. By 1950, equipment had been removed from 706 manufacturing plants, and steel production capacity had been reduced by 6.7 million tons. After lobbying by the Joint Chiefs of Staff and Generals Lucius D. Clay and George Marshall, the Truman administration accepted that economic recovery in Europe could not go forward without the reconstruction of the German industrial base on which it had previously been dependent. In July 1947, President Truman rescinded on "national security grounds" the directive that had ordered the U.S. occupation forces to "take no steps looking toward the economic rehabilitation of Germany." A new directive recognized that "[a]n orderly, prosperous Europe requires the economic contributions of a stable and productive Germany." From mid-1946 onwards Germany received U.S. government aid through the GARIOA programme. From 1948 onwards West Germany also became a minor beneficiary of the Marshall Plan. Volunteer organizations had initially been forbidden to send food, but in early 1946 the Council of Relief Agencies Licensed to Operate in Germany was founded. 
The prohibition against sending CARE Packages to individuals in Germany was rescinded on 5 June 1946. Following the German surrender, the International Red Cross was prohibited from providing aid such as food or visiting POW camps for Germans inside Germany. However, after making approaches to the Allies in the autumn of 1945 it was allowed to investigate the camps in the UK and French occupation zones of Germany, as well as to provide relief to the prisoners held there. On 4 February 1946, the Red Cross was also permitted to visit and assist prisoners in the U.S. occupation zone of Germany, although only with very small quantities of food. The Red Cross petitioned successfully for improvements to be made in the living conditions of German POWs. The German people as a whole, especially the young, were psychologically traumatized by the previous decade of Nazi rule, and major cities and infrastructure had been destroyed by Allied bombardment. The trauma permeated all levels of society, a consequence of the country's systematic Nazification: the Reich Ministry of Public Enlightenment and Propaganda had taken over the media and all institutions, and the very young had been systematically indoctrinated through the Hitler Youth, the Deutsches Jungvolk, the League of German Girls and the Jungmädelbund. At the end of the war, major cities were devastated, food shortages ensued, and a wave of denazification occurred throughout occupied Germany. As France was liberated from German occupation, an épuration (purge) of real and suspected Nazi collaborators began. At first, this was undertaken in an extralegal manner by the French Resistance (called the épuration sauvage, "wild purge"). French women who had had romantic liaisons with German soldiers were publicly humiliated and had their heads shaved. There was also a wave of summary executions estimated to have killed about 10,000 people. 
When the Provisional Government of the French Republic established control, the Épuration légale ("legal purge") began. There were no international war crimes trials for French collaborators, who were tried in the domestic courts. Approximately 300,000 cases were investigated; 120,000 people were given various sentences, including 6,763 death sentences (of which only 791 were carried out). Most convicts were given amnesty a few years later. The aftermath of World War II left Italy angry at the monarchy for its endorsement of the Fascist regime over the previous twenty years. These frustrations contributed to a revival of the Italian republican movement. In the 1946 Italian constitutional referendum, held on 2 June, a day celebrated since as Festa della Repubblica, the Italian monarchy was abolished, having been associated with the deprivations of the war and the Fascist rule, especially in the North, and Italy became a republic. This was the first time that Italian women voted at the national level, and the second time overall, counting the local elections held a few months earlier in some cities. King Victor Emmanuel III's son, King Umberto II, was forced to abdicate and was exiled. The Republican Constitution was approved on 1 January 1948, resulting from the work of a Constituent Assembly formed by the representatives of all the anti-fascist forces that contributed to the defeat of Nazi and Fascist forces during the liberation of Italy. Unlike in Germany and Japan, no war crimes tribunals were held against Italian military and political leaders, though the Italian resistance summarily executed some of them (such as Mussolini) at the end of the war; the Togliatti amnesty, taking its name from the Communist Party secretary at the time, pardoned all wartime common and political crimes in 1946. 
The 1947 Treaty of Peace with Italy spelled the end of the Italian colonial empire, along with other border revisions, such as the transfer of the Italian Islands of the Aegean to the Kingdom of Greece, the transfer to France of Briga and Tenda, and minor revisions of the Franco-Italian border. Moreover, under the Treaty of Peace with Italy, Istria, Kvarner, most of the Julian March, and the Dalmatian city of Zara were annexed by Yugoslavia, causing the Istrian–Dalmatian exodus: between 230,000 and 350,000 local ethnic Italians (Istrian Italians and Dalmatian Italians), together with those ethnic Slovenes, Croats, and Istro-Romanians who chose to maintain Italian citizenship, emigrated to Italy and, in smaller numbers, to the Americas, Australia and South Africa. The 1947 Treaty of Peace compelled Italy to pay $360 million (US dollars at 1938 prices) in war reparations: $125 million to Yugoslavia, $105 million to Greece, $100 million to the Soviet Union, $25 million to Ethiopia and $5 million to Albania. In 1954 the Free Territory of Trieste, an independent territory between northern Italy and Yugoslavia under the direct responsibility of the United Nations Security Council, was divided between the two states, Italy and Yugoslavia. The Italian border that applies today has existed since 1975, when Trieste was formally re-annexed to Italy after the Treaty of Osimo. In 1950, Italian Somaliland was made a United Nations Trust Territory under Italian administration until 1 July 1960. The Federal State of Austria had been annexed by Germany in 1938 (the Anschluss, a union banned by the Treaty of Versailles). Austria (called Ostmark by the Germans) was separated from Germany and divided into four zones of occupation. With the Austrian State Treaty, these zones reunited in 1955 to become the Republic of Austria. 
Following the war, the Allies rescinded the Empire of Japan's pre-war annexations such as Manchuria, and Korea became militarily occupied by the United States in the south and by the Soviet Union in the north. The Philippines and Guam were returned to the United States. Burma, Malaya, and Singapore were returned to Britain, and Indochina to France. The Dutch East Indies was to be handed back to the Dutch, but this was resisted, leading to the Indonesian war of independence. At the Yalta Conference, U.S. president Franklin D. Roosevelt had secretly traded the Japanese Kurils and south Sakhalin to the Soviet Union in return for Soviet entry into the war with Japan. The Soviet Union annexed the Kuril Islands, provoking the Kuril Islands dispute, which is ongoing, as Russia continues to occupy the islands. Hundreds of thousands of Japanese were forced to relocate to the Japanese main islands. Okinawa became a main U.S. staging point. The U.S. covered large areas of it with military bases and continued to occupy it until 1972, years after the end of the occupation of the main islands. The bases remain. To skirt the Geneva Convention, the Allies classified many Japanese soldiers as Japanese Surrendered Personnel (JSP) instead of POWs and used them as forced labour until 1947. The UK, France, and the Netherlands used JSP to support their military operations in the region after World War II. General Douglas MacArthur established the International Military Tribunal for the Far East. The Allies collected reparations from Japan. To further remove Japan as a potential future military threat, the Far Eastern Commission decided to de-industrialize Japan, to reduce the Japanese standard of living to what prevailed between 1930 and 1934. In the end, the de-industrialisation programme in Japan was implemented to a lesser degree than the one in Germany. Japan received emergency aid from GARIOA, as did Germany. 
In early 1946, the Licensed Agencies for Relief in Asia were formed and permitted to supply Japanese with food and clothes. In April 1948 the Johnston Committee Report recommended that the economy of Japan should be reconstructed due to the high cost to US taxpayers of continuous emergency aid. Survivors of the atomic bombings of Hiroshima and Nagasaki, known as hibakusha (被爆者), were ostracized by Japanese society. Japan provided no special assistance to these people until 1952. By the 65th anniversary of the bombings, total casualties from the initial attack and later deaths reached about 270,000 in Hiroshima and 150,000 in Nagasaki. About 230,000 hibakusha were still alive as of 2010, and about 2,200 were suffering from radiation-caused illnesses as of 2007. In the Winter War of 1939–1940, the Soviet Union invaded neutral Finland and annexed some of its territory. From 1941 until 1944, Finland aligned itself with Nazi Germany in a failed effort to regain lost territories from the Soviets. Finland retained its independence following the war but remained subject to Soviet-imposed constraints in its domestic affairs. In 1940 the Soviet Union invaded and annexed the neutral Baltic states, Estonia, Latvia, and Lithuania. In June 1941, the Soviet governments of the Baltic states carried out mass deportations of "enemies of the people"; as a result, many treated the Nazis as liberators when Germany invaded only a week later. The Atlantic Charter promised self-determination to people deprived of it during the war. The British Prime Minister, Winston Churchill, argued for a weaker interpretation of the Charter to permit the Soviet Union to continue to control the Baltic states. In March 1944 the US accepted Churchill's view that the Atlantic Charter did not apply to the Baltic states. With the return of Soviet troops at the end of the war, the Forest Brothers mounted a guerrilla war. This continued until the mid-1950s. 
An estimated one million military and civilian Filipinos were killed from all causes; of these 131,028 were listed as killed in seventy-two war crime events. According to a United States analysis released years after the war, U.S. casualties were 10,380 dead and 36,550 wounded; Japanese dead were 255,795. Population displacement As a result of the new borders drawn by the victorious nations, large populations suddenly found themselves in hostile territory. The Soviet Union took over areas formerly controlled by Germany, Finland, Poland, and Japan. Poland lost the Kresy region (about half of its pre-war territory) and received most of Germany east of the Oder–Neisse line, including the industrial regions of Silesia. The German state of the Saar was temporarily a protectorate of France but later returned to German administration. As set forth at Potsdam, approximately 12 million people were expelled from Germany, including seven million from Germany proper, and three million from the Sudetenland. During the war, the United States government interned approximately 110,000 Japanese Americans and Japanese who lived along the Pacific coast of the United States in the wake of Imperial Japan's attack on Pearl Harbor. Canada interned approximately 22,000 Japanese Canadians, 14,000 of whom were born in Canada. After the war, some internees chose to return to Japan, while most remained in North America. The Soviet Union expelled at least 2 million Poles from the east of the new border approximating the Curzon Line. This estimate is uncertain as neither the Polish Communist government nor the Soviet government kept track of the number of expelled people. The number of Polish citizens inhabiting Polish borderlands (Kresy region) was about 13 million before World War II broke out according to official Polish statistics. 
Polish citizens from the Polish borderlands territory who were killed in the war (whether by the German Nazi regime or the Soviet regime, or expelled to distant parts of Siberia) were accounted as Russian, Ukrainian, or Belarusian casualties of war in official Soviet historiography. This makes it additionally difficult to estimate accurately the number of Polish citizens forcibly transferred after the war. The border change also reversed the results of the 1919–1920 Polish–Soviet War. Former Polish cities such as Lwów came under the control of the Ukrainian Soviet Socialist Republic. Additionally, the Soviet Union transferred more than two million people within its borders; these included Germans, Finns, Crimean Tatars, and Chechens. Rape during occupation and liberation As Soviet troops marched across the Balkans, they committed rapes and robberies in Romania, Hungary, Czechoslovakia and Yugoslavia. The population of Bulgaria was largely spared this treatment, possibly due to a sense of ethnic kinship or to the leadership of Marshal Fyodor Tolbukhin. The population of Germany was treated significantly worse. The rape and murder of German civilians were as bad as, and sometimes worse than, Nazi propaganda had anticipated. Political officers encouraged Soviet troops to seek revenge and terrorise the German population. On "the basis of Hochrechnungen (projections or estimations)", "1.9 million German women altogether were raped at the end of the war by Red Army soldiers." About one-third of all German women in Berlin were raped by Soviet forces. A substantial minority were raped multiple times. In Berlin, contemporary hospital records indicate between 95,000 and 130,000 women were raped by Soviet troops. About 10,000 of these women died, mostly by suicide. Over 4.5 million Germans fled towards the West.
The Soviets initially had no rules against their troops "fraternising" with German women, but by 1947 they started to isolate their troops from the German population in an attempt to stop rape and robbery. Not all Soviet soldiers participated in these activities. Foreign reports of Soviet brutality were denounced as false. Rape, robbery, and murder were blamed on German bandits impersonating Soviet soldiers. Some justified Soviet brutality towards German civilians as retaliation for the earlier brutality of German troops toward Russian civilians. Until the reunification of Germany, East German histories virtually ignored the actions of Soviet troops, and Russian histories still tend to do so. Reports of mass rapes by Soviet troops were often dismissed as anti-Communist propaganda or the normal byproduct of war. Rapes also occurred under other Allied forces in Europe, though the majority were committed by Soviet troops. In a letter to the editor of Time published in September 1945, a United States Army sergeant wrote, "Our own Army and the British Army along with ours have done their share of looting and raping ... This offensive attitude among our troops is not at all general, but the percentage is large enough to have given our Army a pretty black name, and we too are considered an army of rapists." Robert Lilly's analysis of military records led him to conclude that about 14,000 rapes were committed by U.S. soldiers in Britain, France, and Germany between 1942 and 1945. Lilly assumed that only 5% of rapes by American soldiers were reported, making 17,000 GI rapes a possibility, while analysts estimate that 50% of (ordinary peacetime) rapes are reported. Supporting Lilly's lower figure is the "crucial difference" that for World War II military rapes "it was the commanding officer, not the victim, who brought charges". According to German historian Miriam Gebhardt, as many as 190,000 women were raped by U.S. soldiers in Germany.
German soldiers left many war children behind in nations such as France and Denmark, which were occupied for an extended period. After the war, the children and their mothers often suffered recriminations. In Norway, the "Tyskerunger" (German-kids) suffered greatly. During the Italian campaign, the Goumiers, French Moroccan colonial troops attached to the French Expeditionary Forces, were accused of committing rape and murder against Italian peasant communities, mostly targeting civilian women and girls, as well as a few men and boys. In Italy the victims of these acts were described as Marocchinate, literally "Moroccaned" (people subjected to acts committed by Moroccans). According to Italian victims' associations, a total of more than 7,000 civilians, including children, were raped by Goumiers. In the first few weeks of the American military occupation of Japan, rape and other violent crime was widespread in naval ports like Yokohama and Yokosuka but declined shortly afterward. There were 1,336 reported rapes during the first 10 days of the occupation of Kanagawa prefecture. Historian Toshiyuki Tanaka relates that in Yokohama, the capital of the prefecture, there were 119 known rapes in September 1945. Historians Eiji Takemae and Robert Ricketts state that "When U.S. paratroopers landed in Sapporo, an orgy of looting, sexual violence, and drunken brawling ensued. Gang rapes and other sex atrocities were not infrequent" and some of the rape victims committed suicide. General Robert L. Eichelberger, the commander of the U.S. Eighth Army, recorded that in the one instance when the Japanese formed a self-help vigilante guard to protect women from off-duty GIs, the Eighth Army ordered armored vehicles in battle array into the streets and arrested the leaders, who received long prison terms.
According to Takemae and Ricketts, members of the British Commonwealth Occupation Force (BCOF) were also involved in rapes: A former prostitute recalled that as soon as Australian troops arrived in Kure in early 1946, they "dragged young women into their jeeps, took them to the mountain, and then raped them. I heard them screaming for help nearly every night". Such behavior was commonplace, but news of criminal activity by Occupation forces was quickly suppressed. Rape committed by U.S. soldiers occupying Okinawa was also a notable phenomenon. Okinawan historian Oshiro Masayasu (former director of the Okinawa Prefectural Historical Archives) writes: Soon after the U.S. marines landed, all the women of a village on Motobu Peninsula fell into the hands of American soldiers. At the time, there were only women, children and old people in the village, as all the young men had been mobilized for the war. Soon after landing, the marines "mopped up" the entire village, but found no signs of Japanese forces. Taking advantage of the situation, they started "hunting for women" in broad daylight and those who were hiding in the village or nearby air raid shelters were dragged out one after another. According to Toshiyuki Tanaka, 76 cases of rape or rape-murder were reported during the first five years of the American occupation of Okinawa. However, he claims this is probably not the true figure, as most cases were unreported. During World War II, the Japanese military established brothels filled with "comfort women", a euphemism for the 200,000 girls and women who were forced into sexual slavery for Japanese soldiers. In Confucian nations like Korea and China, where premarital sex is considered shameful, the subject of the "comfort women" was ignored for decades after 1945 as the victims were considered pariahs. Dutch comfort women brought a successful case before the Batavia Military Tribunal in 1948. 
Post-war tensions The alliance between the Western Allies and the Soviet Union began to deteriorate even before the war was over, when Stalin, Roosevelt, and Churchill exchanged a heated correspondence over whether the Polish government-in-exile, backed by Roosevelt and Churchill, or the Provisional Government, backed by Stalin, should be recognised. Stalin won. Many Allied leaders felt that war between the United States and the Soviet Union was likely. On 19 May 1945, the American Under-Secretary of State Joseph Grew went so far as to say that it was inevitable. On 5 March 1946, in his "Sinews of Peace" (Iron Curtain) speech at Westminster College in Fulton, Missouri, Winston Churchill said "a shadow" had fallen over Europe. He described Stalin as having dropped an "Iron Curtain" between East and West. Stalin responded by charging that co-existence between communist countries and the West was impossible. In mid-1948 the Soviet Union imposed a blockade on the Western zone of occupation in Berlin. Due to the rising tension in Europe and concerns over further Soviet expansion, American planners came up with a contingency plan code-named Operation Dropshot in 1949. It considered possible nuclear and conventional war with the Soviet Union and its allies to counter a Soviet takeover of Western Europe, the Near East, and parts of Eastern Asia that they anticipated would begin around 1957. In response, the U.S. would saturate the Soviet Union with atomic and high-explosive bombs, and then invade and occupy the country. In later years, to reduce military expenditures while countering Soviet conventional strength, President Dwight Eisenhower would adopt a strategy of massive retaliation, relying on the threat of a U.S. nuclear strike to prevent non-nuclear incursions by the Soviet Union in Europe and elsewhere. The approach entailed a major buildup of U.S. nuclear forces and a corresponding reduction in America's non-nuclear ground and naval strength.
The Soviet Union viewed these developments as "atomic blackmail". In Greece, civil war broke out in 1946 between Anglo-American-supported royalist forces and communist-led forces, with the royalist forces emerging as the victors. The U.S. launched a massive programme of military and economic aid to Greece and to neighbouring Turkey, arising from a fear that the Soviet Union stood on the verge of breaking through the NATO defence line to the oil-rich Middle East. On 12 March 1947, to gain Congressional support for the aid, President Truman described the aid as promoting democracy in defence of the "Free World", a principle that became known as the Truman Doctrine. The U.S. sought to promote an economically strong and politically united Western Europe to counter the threat posed by the Soviet Union. This was done openly using tools such as the European Recovery Program, which encouraged European economic integration. The International Authority for the Ruhr, designed to keep German industry down and controlled, evolved into the European Coal and Steel Community, a founding pillar of the European Union. The United States also worked covertly to promote European integration, for example using the American Committee on United Europe to funnel funds to European federalist movements. To ensure that Western Europe could withstand the Soviet military threat, the Western European Union was founded in 1948 and NATO in 1949. The first NATO Secretary General, Lord Ismay, famously stated the organisation's goal was "to keep the Russians out, the Americans in, and the Germans down". However, without the manpower and industrial output of West Germany no conventional defence of Western Europe had any hope of succeeding. To remedy this, in 1950 the US sought to promote the European Defence Community, which would have included a rearmed West Germany. The attempt was dashed when the French Parliament rejected it. 
On 9 May 1955, West Germany was instead admitted to NATO; the immediate result was the creation of the Warsaw Pact five days later. The Cold War also saw the creation of propaganda and espionage organisations such as Radio Free Europe, the Information Research Department, the Gehlen Organization, the Central Intelligence Agency, the Special Activities Division, and the Ministry for State Security, as well as the radicalization and proliferation of numerous far-left and far-right terrorist organizations in Western European countries (Italy, France, West Germany, Belgium, Francoist Spain, and the Netherlands), with spillovers in Northern and Southeastern Europe. In Asia, the surrender of Japanese forces was complicated by the split between East and West as well as by the movement toward national self-determination in European colonial territories. The decision to decolonize British India led to an agreement to partition the country along religious lines into two independent dominions: India and Pakistan. The partition resulted in communal violence and massive displacements of the population. It is often described as the largest mass human migration and one of the largest refugee crises in history. As agreed at the Yalta Conference, the Soviet Union declared war on Japan. Soviet forces invaded Manchuria, which led to the collapse of Manchukuo and the expulsion of all Japanese settlers from the puppet state. The Soviet Union dismantled the industrial base that the Japanese had built up in Manchuria, which, being under Soviet occupation, subsequently became a base for the Communist Chinese forces. Following the end of the war, the Kuomintang (KMT) party (led by Generalissimo Chiang Kai-shek) and the Communist Chinese forces resumed fighting each other, which they had temporarily suspended in order to fight Japan.
The fight against the Japanese occupiers had strengthened popular support for the Communist forces among the Chinese people, while the KMT had depleted its strength fighting them. Full-scale war between the KMT and the Communists broke out in June 1946. Despite U.S. support for the Kuomintang, Communist forces ultimately prevailed and established the People's Republic of China (PRC) on the mainland. The KMT forces retreated to the island of Taiwan in 1949, where the government of the Republic of China (ROC) re-established itself. With the Communist victory in the civil war, the Soviet Union gave up its claim to military bases in China that had been granted to it by its Western Allies at the end of World War II. While large-scale hostilities had largely ceased by 1950, intermittent clashes occurred between the two from 1950 to 1979. Taiwan unilaterally declared the civil war over in 1991, but no formal peace treaty or truce has been signed, and the PRC officially continues to regard Taiwan as a breakaway province that rightfully belongs to it. The outbreak of the Korean War a few months after the conclusion of the Chinese Civil War and continued U.S. support for the KMT were the main reasons that prevented the PRC from invading Taiwan. At the Yalta Conference, the Allies agreed that an undivided post-war Korea would be placed under four-power multinational trusteeship. After Japan's surrender, this agreement was modified to a joint Soviet-American occupation, with the Soviets occupying Korea from the north and the Americans from the south. Korea, formerly under Japanese rule, and partially occupied by the Red Army following the Soviet Union's entry into the war against Japan, was divided at the 38th parallel on the orders of the U.S. Department of War. A U.S. military government in southern Korea was established in the capital city of Seoul. The American military commander, Lt.
Gen. John R. Hodge, enlisted many former Japanese administrative officials to serve in this government. North of the military line, the Soviets administered the disarming and demobilisation of repatriated Korean nationalist guerrillas who had fought on the side of Chinese nationalists against the Japanese in Manchuria during World War II. Simultaneously, the Soviets enabled a build-up of heavy armaments among pro-communist forces in the north. The military line became a political line in 1948, when separate republics emerged on both sides of the 38th parallel, each claiming to be the legitimate government of Korea. This culminated in the North invading the South and the start of the Korean War two years later. Labour and civil unrest broke out in the British colony of Malaya in 1946. A state of emergency was declared by the colonial authorities in 1948 with the outbreak of acts of terrorism. The situation deteriorated into a full-scale anti-colonial insurgency, or Anti-British National Liberation War as the insurgents referred to it, led by the Malayan National Liberation Army (MNLA), the military wing of the Malayan Communist Party. The Malayan Emergency would endure for the next 12 years, ending in 1960. In 1967, communist leader Chin Peng reopened hostilities, culminating in a second emergency that lasted until 1989. Events after World War II in French Indochina, which consisted of the territories of modern-day Vietnam, Laos and Cambodia, set the stage for the Indochina wars. By 1941, Japan had gained full military access across Indochina and established a fragile dual colonial rule that maintained French administration while facilitating Japanese preparations for Southeast Asian operations. The communist-controlled Viet Minh Front was formed in 1941 to fight against both Japanese and French forces. Because the French colonial authorities began holding secret talks with Free France, the Japanese carried out a coup d'état on 9 March 1945.
When Japan surrendered in August, this created a power vacuum, and the Viet Minh seized power during the August Revolution, declaring the independence of the Democratic Republic of Vietnam. However, the Allies (including the Soviet Union) all agreed that the area belonged to the French. From 1945 onwards, after the ousting of the French administration and the surrender of Japan, the Vietnamese were locked in civil conflicts over the destiny of their post-colonial state. Meanwhile, Nationalist Chinese forces moved in from the north and British from the south (as the French were unable to do so immediately themselves) and then handed power to the French, a process completed by March 1946. Attempts to integrate the Democratic Republic of Vietnam with French rule failed, and the Viet Minh launched their rebellion against French rule that same year, starting the First Indochina War (the Viet Minh organized common fronts to fight the French in Laos and Cambodia). The war ended in 1954 with the French withdrawal and the partition of Vietnam, which was intended to be temporary until elections could be held. The Democratic Republic of Vietnam controlled the North, while the State of Vietnam held the South. Ngo Dinh Diem, backed by the United States in his refusal to hold elections, which he claimed would be unfair due to communist rigging, established the Republic of Vietnam. Communist insurgents in the South formed the National Liberation Front (NLF) under the direct guidance of North Vietnam to fight against the South Vietnamese government, a conflict that ultimately ended with North Vietnam conquering the South in April 1975. Japan invaded and occupied the Dutch East Indies during the war and replaced the colonial government with a new administration. Although the top positions were held by Japanese officers, the internment of all Dutch citizens meant that Indonesians filled many leadership and administrative positions.
Following the Japanese surrender in August 1945, Indonesian nationalist leaders such as Sukarno and Mohammad Hatta declared Indonesia independent. A four-and-a-half-year struggle followed as the Dutch tried to re-establish their rule in the colony, using a significant portion of their Marshall Plan aid to this end. The Dutch were aided by British forces for the first phase of the conflict until the United Kingdom withdrew. The British also initially used 35,000 Japanese Surrendered Personnel to support their military operations in Indonesia. Although Dutch forces re-occupied most of Indonesia, a guerrilla campaign supported by the majority of Indonesians ensued, and international opinion ultimately favoured independence. In December 1949, the Netherlands formally recognised Indonesian sovereignty. Wartime criminals recruited as Cold War assets British covert operations in the Baltic States, which began in 1944 against the Nazis, escalated following the war. In Operation Jungle, the Secret Intelligence Service (known as MI6) recruited and trained Estonians, Latvians, and Lithuanians for clandestine work in the Baltic states between 1948 and 1955. Leaders of the operation included Alfons Rebane, Stasys Žymantas, and Rūdolfs Silarājs. The agents were transported under the cover of the "British Baltic Fishery Protection Service". They launched from British-occupied Germany, using a converted World War II E-boat captained and crewed by former members of the wartime German navy. British intelligence also trained and infiltrated anti-communist agents into the Soviet Union from across the Finnish border, with orders to assassinate Soviet officials. In the end, counter-intelligence supplied to the KGB by Kim Philby allowed the KGB to penetrate and ultimately gain control of MI6's entire intelligence network in the Baltic states. Vietnam and the Middle East would later damage the reputation gained by the U.S. during its successes in Europe.
The KGB believed that the Third World rather than Europe was the arena in which it could win the Cold War. Moscow would in later years fuel an arms buildup in Africa. Many African countries used as proxies in the Cold War would later become "failed states". In 2014, The New York Times reported that "In the decades after World War II, the Central Intelligence Agency (CIA) and other United States agencies employed at least a thousand Nazis as Cold War spies and informants and, as recently as the 1990s, concealed the government's ties to some still living in America, newly disclosed records and interviews show." According to Timothy Naftali, "The CIA's central concern [in recruiting former Nazi collaborators] was not so much the extent of the criminal's guilt as the likelihood that the agent's criminal past could remain a secret" (p. 365). When the divisions of postwar Europe began to emerge, the war crimes programmes and denazification policies of Britain and the United States were relaxed in favour of recruiting German scientists, especially nuclear and long-range rocket scientists. Many of these, prior to their capture, had worked on developing the German V-2 long-range rocket at the Baltic coast German Army Research Center Peenemünde. Western Allied occupation officers in Germany were ordered not to cooperate with the Soviets in sharing captured wartime secret weapons. To recover advanced German aviation technology and personnel, the British sent the Fedden Mission into Germany to contact its aviation technology centers and key personnel, paralleled by the United States with its own Operation Lusty aviation technology and knowledge recovery programme. In Operation Paperclip, beginning in 1945, the United States imported 1,600 German scientists and technicians, as part of the intellectual reparations owed to the U.S.
and the UK, including about $10 billion (US$165 billion in 2025 dollars) in patents and industrial processes. In late 1945, three German rocket-scientist groups arrived in the U.S. for duty at Fort Bliss, Texas, and at White Sands Proving Grounds, New Mexico, as "War Department Special Employees". The wartime activities of some Operation Paperclip scientists would later be investigated. Arthur Rudolph left the United States in 1984 to avoid prosecution. Similarly, Georg Rickhey, who came to the United States under Operation Paperclip in 1946, was returned to Germany to stand trial at the Mittelbau-Dora war crimes trial in 1947. Following his acquittal, he returned to the United States in 1948 and eventually became a U.S. citizen. The Soviets began Operation Osoaviakhim in 1946. NKVD and Soviet army units deported thousands of military-related technical specialists from the Soviet occupation zone of post-war Germany to the Soviet Union. The Soviets used 92 trains to transport the specialists and their families, an estimated 10,000–15,000 people. Much related equipment was also moved, the aim being to virtually transplant research and production centres, such as the relocated V-2 rocket centre at Mittelwerk Nordhausen, from Germany to the Soviet Union. Among the people moved were Helmut Gröttrup and about two hundred scientists and technicians from Mittelwerk. Personnel were also taken from AEG, BMW's Stassfurt jet propulsion group, IG Farben's Leuna chemical works, Junkers, Schott AG, Siebel, Telefunken, and Carl Zeiss AG. The operation was commanded by the NKVD deputy chief, Colonel General Serov, outside the control of the local Soviet Military Administration. The major reason for the operation was the Soviet fear of being condemned for noncompliance with Allied Control Council agreements on the liquidation of German military installations.
Some Western observers thought Operation Osoaviakhim was a retaliation for the failure of the Socialist Unity Party in elections, though Osoaviakhim was clearly planned before that. Demise of the League of Nations and the founding of the United Nations As a general consequence of the war and in an effort to maintain international peace, the Allies formed the United Nations (UN), which officially came into existence on 24 October 1945. The UN replaced the defunct League of Nations (LN) as an intergovernmental organization. The LN was formally dissolved on 20 April 1946 but had in practice ceased to function in 1939, being unable to stop the outbreak of World War II. The UN inherited some of the bodies of the LN, such as the International Labour Organization. League of Nations mandates, mostly territories that had changed hands in World War I, became United Nations trust territories. South West Africa, an exception, was still governed under the terms of the original mandate. As the successor body to the League, the UN assumed a supervisory role over the territory. The Free City of Danzig, a semi-autonomous city-state that was partly overseen by the League, became part of Poland. The UN adopted the Universal Declaration of Human Rights in 1948, "as a common standard of achievement for all peoples and all nations." The Soviet Union abstained from voting on adoption of the declaration. The U.S. did not ratify the social and economic rights sections. The five major Allied powers were given permanent membership in the United Nations Security Council. The permanent members can veto any United Nations Security Council resolution; such resolutions are the only UN decisions that are binding under international law. The five powers at the time of founding were: the United States of America, the United Kingdom, France, the Soviet Union and the Republic of China.
The Republic of China lost the Chinese Civil War and retreated to the island of Taiwan by 1950 but continued to be a permanent member of the Council even though the de facto state in control of mainland China was the People's Republic of China (PRC). This was changed in 1971 when the PRC was given the permanent membership previously held by the Republic of China. Russia inherited the permanent membership of the Soviet Union in 1991 after the dissolution of that state. Unresolved conflicts Japanese holdouts persisted on various islands in the Pacific Theatre until at least 1974. Although all hostilities are now resolved, a peace treaty has never been signed between Japan and Russia due to the Kuril Islands dispute. Economic aftermath By the end of the war, the European economy had collapsed with some 70% of its industrial infrastructure destroyed. The property damage in the Soviet Union consisted of complete or partial destruction of 1,710 cities and towns, 70,000 villages/hamlets, and 31,850 industrial establishments. The strength of the economic recovery following the war varied throughout the world, though in general, it was quite robust, particularly in the United States. In Europe, West Germany, after having continued to decline economically during the first years of the Allied occupation, later experienced a remarkable recovery, and had by the end of the 1950s doubled production from its pre-war levels. Italy came out of the war in poor economic condition, but by the 1950s, the Italian economy was marked by stability and high growth. France rebounded quickly and enjoyed rapid economic growth and modernisation under the Monnet Plan. The UK, by contrast, was in a state of economic ruin after the war and continued to experience relative economic decline for decades to follow. The Soviet Union also experienced a rapid increase in production in the immediate post-war era. 
Japan experienced rapid economic growth, becoming one of the most powerful economies in the world by the 1980s. China, following the conclusion of its civil war, was essentially bankrupt. By 1953, economic restoration seemed fairly successful as production had resumed pre-war levels. Although China's growth mostly persisted, it was severely disrupted by the economic experiments of the Great Leap Forward, whose resulting famine caused the deaths of 15 to 55 million people. At the end of the war, the United States produced roughly half of the world's industrial output. The US had been spared industrial and civilian devastation. Further, much of its pre-war industry had been converted to wartime usage. As a result, with its industrial and civilian base in much better shape than most of the world, the U.S. embarked on an economic expansion unseen in human history. U.S. gross domestic product increased from $228 billion in 1945 to just under $1.7 trillion in 1975. Denazification In 1951 several laws were passed, ending denazification. As a result, many people with a former Nazi past ended up again in the political apparatus of West Germany. West German President Walter Scheel and Chancellor Kurt Georg Kiesinger were both former members of the Nazi Party. In 1957, 77% of the West German Ministry of Justice's senior officials were former Nazi Party members. Konrad Adenauer's State Secretary Hans Globke had played a major role in drafting the antisemitic Nuremberg Race Laws in Nazi Germany. The historical consensus is that West Germany's efforts at denazification proved wanting (p. 191). Unexploded ordnance Unexploded ordnance continues to pose a danger in the present day. In 2017, fifty thousand people were evacuated from Hanover so World War II era bombs could be defused. As of 2023, it is thought that thousands of unexploded bombs remain from World War II.
Environment When World War II ended, scientists did not have procedures for the safe disposal of chemical arsenals. At the direction of the UK, US and Russia, chemical weapons were loaded onto ships by the metric ton and dumped into the sea. The exact locations of the dumping are not known due to poor record keeping, but it is estimated that 1 million metric tons of chemical weapons remain on the ocean floor, where they are rusting and pose the risk of leaks. Sulfur mustard exposure has been reported in some parts of coastal Italy, and sulfur mustard bombs have been found as far away as Delaware, likely brought in with harvested shellfish.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Cut_(graph_theory)] | [TOKENS: 933] |
Cut (graph theory) In graph theory, a cut is a partition of the vertices of a graph into two disjoint subsets. Any cut determines a cut-set, the set of edges that have one endpoint in each subset of the partition. These edges are said to cross the cut. In a connected graph, each cut-set determines a unique cut, and in some cases cuts are identified with their cut-sets rather than with their vertex partitions. In a flow network, an s–t cut is a cut that requires the source and the sink to be in different subsets, and its cut-set only consists of edges going from the source's side to the sink's side. The capacity of an s–t cut is defined as the sum of the capacities of the edges in the cut-set. Definition A cut C = (S, T) is a partition of the vertex set V of a graph G = (V, E) into two subsets S and T. The cut-set of a cut C = (S, T) is the set {(u, v) ∈ E | u ∈ S, v ∈ T} of edges that have one endpoint in S and the other endpoint in T. If s and t are specified vertices of the graph G, then an s–t cut is a cut in which s belongs to the set S and t belongs to the set T. In an unweighted undirected graph, the size or weight of a cut is the number of edges crossing the cut. In a weighted graph, the value or weight is defined by the sum of the weights of the edges crossing the cut. A bond is a cut-set that does not have any other cut-set as a proper subset. Minimum cut A cut is minimum if the size or weight of the cut is not larger than the size of any other cut. The illustration on the right shows a minimum cut: the size of this cut is 2, and there is no cut of size 1 because the graph is bridgeless. The max-flow min-cut theorem proves that the maximum network flow and the sum of the cut-edge weights of any minimum cut that separates the source and the sink are equal. There are polynomial-time methods to solve the min-cut problem, notably the Edmonds–Karp algorithm. Maximum cut A cut is maximum if the size of the cut is not smaller than the size of any other cut.
The illustration on the right shows a maximum cut: the size of the cut is equal to 5, and there is no cut of size 6, or |E| (the number of edges), because the graph is not bipartite (there is an odd cycle). In general, finding a maximum cut is computationally hard. The max-cut problem is one of Karp's 21 NP-complete problems. The max-cut problem is also APX-hard, meaning that there is no polynomial-time approximation scheme for it unless P = NP. However, it can be approximated to within a constant approximation ratio using semidefinite programming. Note that min-cut and max-cut are not dual problems in the linear programming sense, even though one gets from one problem to the other by changing min to max in the objective function. The max-flow problem is the dual of the min-cut problem. Sparsest cut The sparsest cut problem is to bipartition the vertices so as to minimize the ratio of the number of edges across the cut divided by the number of vertices in the smaller half of the partition. This objective function favors solutions that are both sparse (few edges crossing the cut) and balanced (close to a bisection). The problem is known to be NP-hard, and the best known approximation algorithm is an O(√(log n)) approximation due to Arora, Rao & Vazirani (2009). Cut space The family of all cut sets of an undirected graph is known as the cut space of the graph. It forms a vector space over the two-element finite field of arithmetic modulo two, with the symmetric difference of two cut sets as the vector addition operation, and is the orthogonal complement of the cycle space. If the edges of the graph are given positive weights, the minimum weight basis of the cut space can be described by a tree on the same vertex set as the graph, called the Gomory–Hu tree.
Each edge of this tree is associated with a bond in the original graph, and the minimum cut between two nodes s and t is the minimum weight bond among the ones associated with the path from s to t in the tree.
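The definitions above (cut, cut-set, cut weight, minimum and maximum cut) can be made concrete with a short sketch. The brute-force enumeration below is exponential in the number of vertices and is purely illustrative; practical minimum cuts are found with the polynomial-time max-flow methods named earlier.

```python
from itertools import combinations

def cut_set(edges, S):
    """Edges with exactly one endpoint in S: the cut-set of the cut (S, V - S)."""
    S = set(S)
    return [(u, v) for (u, v) in edges if (u in S) != (v in S)]

def cut_weight(edges, S):
    # Unweighted graph: the weight of a cut is the number of crossing edges.
    return len(cut_set(edges, S))

def brute_force_cuts(vertices, edges):
    """Enumerate every proper bipartition (S, T); return (min, max) cut weights.
    Exponential in |V| -- for illustration only."""
    vertices = list(vertices)
    best_min, best_max = float("inf"), 0
    rest = vertices[1:]  # pin vertices[0] in S so each cut is counted once
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            S = {vertices[0], *extra}
            if len(S) == len(vertices):
                continue  # T must be nonempty
            w = cut_weight(edges, S)
            best_min, best_max = min(best_min, w), max(best_max, w)
    return best_min, best_max

# A 4-cycle: bridgeless (so no cut of size 1) and bipartite (so max cut = |E|).
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(brute_force_cuts(V, E))  # → (2, 4)
```

On the 4-cycle, the minimum cut has size 2 (the graph is bridgeless) and the maximum cut equals |E| = 4 (the graph is bipartite), matching the characterizations in the text.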
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Outline_of_the_Python_programming_language] | [TOKENS: 126] |
Outline of the Python programming language The following outline is provided as an overview of and topical guide to Python: Python is a general-purpose, interpreted, object-oriented, multi-paradigm, and dynamically typed programming language known for its emphasis on code readability, developer productivity, and a broad standard library. Python was created by Guido van Rossum and first released in 1991. What type of language is Python? History of Python General Python concepts Issues and limitations Python implementations Python toolchain Notable projects using Python Python development communities Example source code Python publications Python programmers Python conferences Python learning resources
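Two of the traits listed above, dynamic typing and multi-paradigm support, can be shown in a few lines:

```python
# Dynamic typing: a name can be rebound to values of different types at runtime.
x = 42
print(type(x).__name__)    # int
x = "forty-two"
print(type(x).__name__)    # str

# Multi-paradigm: the same sum, written imperatively and with a built-in
# functional-style reduction.
nums = [1, 2, 3, 4]
total = 0
for n in nums:             # imperative loop
    total += n
assert total == sum(nums)  # functional-style built-in gives the same result
```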
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Jack_Cohen_(scientist)] | [TOKENS: 782] |
Jack Cohen (biologist) Jack Cohen FRSB (19 September 1933 – 6 May 2019) was a British reproductive biologist also known for his science books and his involvement with science fiction. Life Cohen was born on 19 September 1933 in Norwich, but grew up in Stoke Newington. His father was killed shortly after the end of the Second World War, on 1 September 1945. His grandfather was a rabbi, and Cohen was an observant Jew in his youth. He continued to attend the synagogue for cultural reasons. He was married three times and had six children. Academic career Cohen studied at University College, Hull, where he obtained a BSc (an external degree of the University of London) in 1954. He obtained his PhD in zoology at the same institution (by then Hull University) in 1957. He went to the University of Birmingham for post-doctoral work and was appointed lecturer in the Department of Zoology and Comparative Physiology in 1959. He worked for a year at Harvard Medical School, then returned to Birmingham as a senior lecturer in 1968, a position he held until 1987. His former students include Sir Paul Nurse, winner of the 2001 Nobel Prize for Medicine. In 1974 the University of Birmingham awarded him a DSc for his work. From 1987 to 1989 he was senior embryological advisor and manager of laboratories at the IVF/Infertility Clinic of a London private hospital. From 1995 to 1996 he was visiting professor at the Weizmann Institute, Israel. From 1996 to 2000 he was a consultant at the University of Warwick, jointly to the Ecosystems Unit of the Biology Department and the Mathematics Institute. He was an honorary professor at the Mathematics Institute of the University of Warwick and a visiting professor at Durham Business School. He published in prestigious journals such as Nature and wrote textbooks such as Living Embryos – an Introduction to the Study of Animal Development (1967) and Reproduction (1977).
His theory of sperm redundancy was important in studies of fertility and the treatment of infertility. Other activities Cohen worked as a consultant for science fiction television shows and science fiction novels regarding the creation of plausible aliens. The writers who acknowledged his assistance included Anne McCaffrey for the Dragonriders of Pern; Harry Harrison for his Eden trilogy; Larry Niven, Jerry Pournelle and Steven Barnes for their Legacy of Heorot; James White of Sector General fame; David Gerrold for the Chtorr ecology; and Terry Pratchett for several works. Cohen and fellow University of Warwick researcher Ian Stewart, a mathematician, collaborated with Terry Pratchett to write four Science of Discworld books, which accompany his Discworld series. Pratchett made them both "Honorary Wizards of the Unseen University" at the same 1999 ceremony where the University of Warwick gave Pratchett an honorary degree. Anne McCaffrey dedicated All the Weyrs of Pern (1991) to Jack and Judy Cohen and credited Jack with making fact of her fiction. Cohen and Stewart also co-authored books on epistemology. Cohen was a member of the high-IQ society Mensa. He was part of the small group of British Mensans who persuaded science fiction author Isaac Asimov to visit the United Kingdom in June 1974. He had a long-standing interest in the design and natural balance of (particularly man-made) lake ecosystems, designing new filtration systems and also leading the reinstatement of Victorian-designed systems at various locations around the UK. In 2009, he became a patron of the anti-circumcision charity NORM-UK. His hobbies, according to the author profiles in his books, included boomerang-throwing and keeping strange animals.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Euclid_(programming_language)] | [TOKENS: 323] |
Euclid (programming language) Euclid is an imperative programming language for writing verifiable programs. It was designed in the mid-1970s by Butler Lampson and James G. Mitchell at the Xerox PARC lab in collaboration with Jim Horning at the University of Toronto, Ralph L. London at USC ISI and Gerald J. Popek at UCLA. The implementation was led by Ric Holt at the University of Toronto, and James Cordy was the principal programmer for the first implementation of the compiler. It was originally designed for the Motorola 6809 microprocessor. It was considered innovative for the time; the compiler development team had a $2 million budget over two years and was commissioned by the Defense Advanced Research Projects Agency of the U.S. Department of Defense and the Canadian Department of National Defence. It was used for a few years at I. P. Sharp Associates, MITRE Corporation, SRI International and various other international institutes for research in systems programming and secure software systems. Euclid is descended from Pascal, Mesa, Alphard, CLU, Gypsy, BCPL, Modula, LIS, and SUE. Functions in Euclid are closed scopes, may not have side effects, and must explicitly declare imports. Euclid also disallows gotos, floating-point numbers, global assignments, nested functions and aliasing: no two actual parameters to a function may refer to the same memory cell (which Euclid calls a "variable"). Euclid implements modules as types. Descendants of Euclid include the Concurrent Euclid programming language and the Turing programming language.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-barlow88-31] | [TOKENS: 11899] |
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's, or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread in the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north–south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground, but it also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
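As a sanity check, the 1.88-Earth-year Martian year quoted above follows directly from Kepler's third law, T² = a³ (T in years, a in astronomical units). The semi-major axis used below, roughly 1.524 AU, is an assumed standard value and is not stated in the text:

```python
# Kepler's third law in Solar units: T[years]**2 = a[AU]**3.
a_mars = 1.524            # semi-major axis of Mars in AU (assumed standard value)
T_years = a_mars ** 1.5   # orbital period in Earth years
T_days = T_years * 365.25
print(f"{T_years:.2f} yr = {T_days:.0f} Earth days")  # ≈ 1.88 yr ≈ 687 days
```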
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans, and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, now dominates the geological processes shaping the planet. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, Sun or Earth; in the same year came the first uncontrolled impact (Mars 2) and the first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 billion to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary periods are the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches.
Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. 
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found.
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day, or 22 millirads per day, experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts of radiation per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past.
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. 
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. 
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars a planet with possibly a two-tectonic-plate arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high-energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw.
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. 
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by a non-biological process such as serpentinization involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Compared to Earth, the higher concentration of atmospheric CO2 and lower surface pressure may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported radiation levels on the surface of the planet Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons that alternate between its northern and southern hemispheres, similar to Earth's. 
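The ~240 m/s sound speed measured by Perseverance can be roughly reproduced from the ideal-gas relation c = √(γRT/M). The adiabatic index, temperature and molar mass below are assumed round values, not measurements from the text; the real Martian sound speed is also frequency-dependent, which this simple formula does not capture.

```python
# Ideal-gas estimate of the speed of sound in a CO2-dominated atmosphere.
import math

gamma = 1.3      # assumed adiabatic index for CO2 at Martian temperatures
R = 8.314        # universal gas constant, J/(mol*K)
T = 230.0        # assumed near-surface temperature, K
M = 0.04334      # assumed mean molar mass of Martian air, kg/mol

c = math.sqrt(gamma * R * T / M)
print(round(c))  # in the neighbourhood of the 240-250 m/s quoted above
```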
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area, to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. Seasonal changes also produce a covering of dry ice on the polar ice caps. Hydrology Although Mars contains substantial amounts of water, most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. 
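The 43% sunlight figure above follows directly from the inverse-square law, as a quick check shows:

```python
# Solar flux falls off with the inverse square of distance from the Sun.
distance_au = 1.52            # Mars's distance as a multiple of Earth's (1 AU)
fraction = 1 / distance_au**2
print(f"{fraction:.0%}")      # prints "43%" of the sunlight Earth receives
```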
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with carbon dioxide (dry ice) snow. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much longer than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. 
No gullies partially degraded by weathering, and no impact craters superimposed on gullies, have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10−4) is five to seven times the amount on Earth (D/H = 1.56 × 10−4), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. 
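The "five to seven times" deuterium enrichment above is just the quoted Martian D/H value, with its uncertainty, divided by the terrestrial one:

```python
# Deuterium enrichment of the Martian atmosphere relative to Earth.
d_h_mars = 9.3e-4        # measured Martian atmospheric D/H ratio
d_h_mars_err = 1.7e-4    # quoted uncertainty
d_h_earth = 1.56e-4      # terrestrial D/H ratio

ratio_low = (d_h_mars - d_h_mars_err) / d_h_earth
ratio_central = d_h_mars / d_h_earth
ratio_high = (d_h_mars + d_h_mars_err) / d_h_earth
print(round(ratio_low, 1), round(ratio_central, 1), round(ratio_high, 1))
# the uncertainty band spans roughly 4.9x to 7.1x, hence "five to seven times"
```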
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometers). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Mars and Earth, is the second lowest among the planets relative to Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. About 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth (at opposition) once every synodic period of 779.94 days. 
It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
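The 780-day synodic period quoted above follows from the two planets' orbital periods: Earth "laps" Mars at the rate at which their angular speeds around the Sun differ. A quick check, using round values for the two sidereal years:

```python
# Synodic period from sidereal orbital periods:
# 1 / S = 1 / P_earth - 1 / P_mars
earth_year = 365.256     # days
mars_year = 686.980      # days (the article's "687 (Earth) days")

synodic = 1 / (1 / earth_year - 1 / mars_year)
print(round(synodic, 1)) # ≈ 779.9 days, matching the quoted 779.94
```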
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. 
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. By analyzing rocks which point to tidal processes on the planet, it is possible that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. 
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, 'fiery'). More commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy was presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In the East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609 Johannes Kepler published a 10-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. 
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, Italian astronomer Galileo Galilei made the first use of a telescope for astronomical observation, including of Mars. The diurnal parallax of Mars was later measured telescopically in an effort to determine the Sun–Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. 
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were radically overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Until 1997, and after Viking 1 shut down in 1982, Mars was only visited by three unsuccessful probes: two flying past without contact (Phobos 1, 1988; Mars Observer, 1993), and one (Phobos 2, 1989) malfunctioning in orbit before reaching its destination, Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted until today. 
It produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Further missions to Mars are planned. As of February 2024, debris from these missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. 
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinite. Impact glass, formed by meteor impacts, which on Earth can preserve signs of life, has also been found on the surface of impact craters on Mars; if life existed at those sites, the glass could have preserved its signs. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. 
Although the find is highly intriguing, no definitive determination of a biological or abiotic origin for this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, as of 2021, China was planning a crewed Mars mission for 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth and in situ resource utilization on Mars, until the Mars colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". 
The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. 
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. |
======================================== |
[SOURCE: https://techcrunch.com/2026/02/20/toy-story-5-takes-aim-at-creepy-ai-toys-im-always-listening/] | [TOKENS: 652] |
‘Toy Story 5’ takes aim at creepy AI toys: ‘I’m always listening’ When the first Toy Story movie came out in 1995, Google didn’t exist yet and Apple was on the verge of bankruptcy. No one could have predicted that over 30 years later, Pixar would still be making Toy Story movies, nor could anyone have known that the latest installment in the franchise would pit Buzz Lightyear and a balding Woody against an evil AI tablet called Lilypad. But sure enough, “Toy Story 5” pits old-school toys like Mrs. Potato Head, Rex, and Slinky Dog against the sinister threat of technology. The trailer shows Bonnie, the young girl who inherited Andy’s toys when he left for college in “Toy Story 3,” playing outside with her toys when a surprise package with the Lilypad tablet arrives for her. She becomes completely enraptured by the tablet, not even looking up from the screen when her parents tell her that screen time is over. The “Toy Story 5” trailer makes the Lilypad — or Lily — out to be a sinister villain. When Jessie confronts the tablet about Bonnie’s well-being, Lily seems not to be paying attention, so the cowgirl demands that the tablet listen to her. “I’m always listening,” Lily says ominously, regurgitating Jessie’s impassioned speech in a computerized tone… and then translating it into Spanish. “Tech’s invaded our house,” Jessie tells Woody. 
“I’m losing Bonnie to this device.” Woody replies, “Toys are for play, but tech is for everything.” Could “Toy Story 5” pull at the heartstrings of young children and get them to think twice about the consequences of excessive screen time? That may be a stretch. But at least it gives them something to watch that’s not as mind-numbing as Cocomelon. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/9/11] | [TOKENS: 17701] |
September 11 attacks The September 11 attacks,[f] colloquially known as 9/11,[g] were four coordinated Islamist terrorist suicide attacks by al-Qaeda against the United States in 2001. Nineteen terrorists hijacked four airliners, then flew one into each of the Twin Towers at the World Trade Center in New York City. The third crashed into the Pentagon, the headquarters of the U.S. Department of Defense, in Arlington County, Virginia. The fourth plane crashed in a rural Pennsylvania field during a passenger revolt. In response to the attacks, the United States launched the two-decade global war on terror, with the mission to eliminate hostile groups deemed terrorist organizations and the governments purported to support them. Ringleader and hijacker Mohamed Atta flew American Airlines Flight 11 into the North Tower of the World Trade Center complex at 8:46 a.m. At 9:03 a.m.,[h] hijacker Marwan al-Shehhi flew United Airlines Flight 175 into the South Tower. Both collapsed within an hour and forty-two minutes,[i] destroying the remaining five structures in the complex. Hijacker Hani Hanjour flew American Airlines Flight 77 into the Pentagon at 9:37 a.m., causing a partial collapse. United Airlines Flight 93, flown by hijacker Ziad Jarrah, was believed to target either the United States Capitol or the White House. Alerted to the previous attacks, the passengers revolted and the hijackers crashed the aircraft into a field near Shanksville, Pennsylvania, at 10:03 a.m. The Federal Aviation Administration ordered a ground stop for all traffic in U.S. airspace, requiring all airborne aircraft to return to their point of origin or divert to Canada. The actions undertaken in Canada to support incoming aircraft and their occupants were collectively titled Operation Yellow Ribbon. That evening, the Central Intelligence Agency informed President George W. 
Bush that its Counterterrorism Center had identified the attacks as having been the work of al-Qaeda under Osama bin Laden. The United States responded by launching the war on terror and invading Afghanistan. NATO's invocation of Article 5 of the North Atlantic Treaty—its only usage to date—called upon allies to fight al-Qaeda. As U.S. and allied invasion forces swept through Afghanistan, bin Laden eluded them. He denied any involvement until 2004, when excerpts of a taped statement in which he accepted responsibility for the attacks were released. Al-Qaeda's cited motivations included U.S. support of Israel, the presence of U.S. military bases in Saudi Arabia and sanctions against Iraq. The nearly decade-long manhunt for bin Laden concluded in May 2011, when he was killed during a U.S. military raid in Abbottabad, Pakistan. The war in Afghanistan continued for another eight years. The attacks killed 2,977 people, injured thousands more,[j] and gave rise to long-term health consequences, while causing at least US$10 billion in infrastructure and property damage. It remains the deadliest terrorist attack in history, as well as the deadliest incident for firefighters and law enforcement personnel in American history, killing 343 and 72 members, respectively. The crashes of Flight 11 and Flight 175 were the deadliest aviation disasters of all time, and the collision of Flight 77 with the Pentagon resulted in the fourth-highest number of ground fatalities in a plane crash in history. The destruction of the World Trade Center and its environs seriously harmed the U.S. economy and induced global market shocks. Many other countries strengthened anti-terrorism legislation and expanded the powers of their law enforcement and intelligence agencies. The total number of deaths caused by the attacks, combined with the death tolls from the conflicts they directly incited, has been estimated by the Costs of War Project to be over 4.5 million. 
Cleanup of the World Trade Center site (colloquially known as "Ground Zero") was completed in May 2002, while the Pentagon was repaired within a year. After delays in the design of a replacement complex, six new buildings were planned to replace the lost towers at the World Trade Center site, along with a museum and memorial dedicated to those who were killed or injured in the attacks. The tallest building, One World Trade Center, began construction in 2006, and opened in 2014. Memorials to the attacks include the National September 11 Memorial & Museum in New York City, the Pentagon Memorial in Arlington County, Virginia, and the Flight 93 National Memorial at the Pennsylvania crash site. Background In 1996, Osama bin Laden, by then a former Saudi Arabian citizen who led the Islamist militant organization al-Qaeda, issued his first fatwā, which declared holy war against the United States and demanded the expulsion of all American troops stationed in the Arabian Peninsula. An adherent of Islam, bin Laden interpreted Muhammad as banning non-Muslims from the Peninsula. He thus considered the U.S. troop presence a provocation to all Muslims. Regarding his holy war, bin Laden stated: "We do not differentiate between those dressed in military uniforms, and civilians; they are all targets of this fatwa". Bin Laden was living in Sudan prior to 1996, when the Sudanese government exiled him after Saudi and U.S. pressure. Bin Laden returned to Afghanistan, which was run by the Taliban. They allowed al-Qaeda to use the country as its base of operations. He then orchestrated the 9/11 attacks on the U.S. in 2001. He personally told his subordinate participants to target the World Trade Center in New York City and the Pentagon in Arlington, Virginia, with hijacked planes. He initially denied his role in the attacks, but later recanted his denial. 
Al Jazeera broadcast a statement by bin Laden on September 16, 2001: "I stress that I have not carried out this act, which appears to have been carried out by individuals with their own motivation". In November 2001, U.S. forces recovered a videotape in which bin Laden, talking to Khaled al-Harbi, admitted foreknowledge of the attacks. In a 2004 video, he unambiguously confirmed that he had organized 9/11. A video by Al Jazeera in 2006 shows bin Laden with one of the attacks' chief planners, Ramzi bin al-Shibh, as well as hijackers Hamza al-Ghamdi and Wail al-Shehri, making preparations for the attacks. Bin Laden's 1996 fatwā, and similar statements that called for the killing of Americans, are seen by investigators as evidence of his motivation for the attacks. In a second fatwā in 1998, he outlined more of his objections to American foreign policy, such as American support of Israel, and the U.S. and other nations' sanctions against Iraq, condemning the "protracted blockade" which he said constituted a declaration of war against "Allah, his messenger, and Muslims". He believed that the U.S. was being directed by an international Jewish conspiracy into killing as many Muslims as possible, and that all Muslims must wage a defensive war against the U.S. to combat its aggression against them. This was to be done until the aggression ceased. Bin Laden further believed it would send a message to the American people, forcing the U.S. to reevaluate its policies. In a 1998 interview with American journalist John Miller, he stated: American history does not distinguish between civilians and military, not even women and children. They are the ones who used bombs against Nagasaki. Can these bombs distinguish between infants and military? America does not have a religion that will prevent it from destroying all people. 
So we tell the Americans as people and we tell the mothers of soldiers and American mothers in general that if they value their lives and the lives of their children, to find a nationalist government that will look after their interests and not the interests of the Jews. After 9/11, bin Laden maintained that women and children were not targeted in the attack—rather, symbols of America's "economic and military power". In December 2001, a video of bin Laden was released, in which he stops short of admitting responsibility for 9/11, but says: It has become clear that the West in general and America in particular have an unspeakable hatred for Islam. [...] It is the hatred of crusaders. Terrorism against America deserves to be praised because it was a response to injustice, aimed at forcing America to stop its support for Israel, which kills our people. [...] We say that the end of the United States is imminent, whether bin Laden or his followers are alive or dead, for the awakening of the Muslim ummah [nation] has occurred. [...] It is important to hit the economy [of the U.S.], which is the base of its military power... If the economy is hit, they will become preoccupied. In a 2002 manifesto, he listed multiple factors implied to have motivated 9/11, including U.S. support of: Israel, against Lebanon during their occupation of Southern Lebanon, and against Palestinians during the Second Intifada; the Philippines, against Muslim militants; Russia, against Muslim militants; and India, against Muslim civilians in Kashmir. He also listed the former U.S.-led intervention against Muslim militants in Somalia; pollution caused by the U.S.; and the U.S.' refusal to ratify the Kyoto Protocol. In the 2004 video, he said he was inspired to destroy the World Trade Center's Twin Towers after watching the destruction of towers in Lebanon by Israel during the 1982 Lebanon War. "God knows it did not cross our minds to attack the towers, but after [witnessing] [...] 
the destroyed towers in Lebanon, it occurred to me to punish the unjust the same way: to destroy towers in America, so it could taste some of what we are tasting, and to stop killing our children and women." Other motives have been suggested in addition to those stated by him and other al-Qaeda members. Some authors suggested the "humiliation" that resulted from the Islamic world falling behind the Western world—this discrepancy was rendered especially visible by globalization—as well as a desire to provoke the U.S. into a broader war against the Islamic world in the hope of motivating more allies to support al-Qaeda. Similarly, others have argued the 9/11 attacks were a strategic move to provoke America into a war that would incite a pan-Islamic revolution. In April 2002, Yosri Fouda of Al Jazeera met al-Qaeda members Khalid Sheikh Mohammed and Ramzi bin al-Shibh, who were in hiding, and they admitted to him their involvement in the attacks. Before 9/11, Mohammed had been an organizer and financier of al-Qaeda's 1993 bombing of the World Trade Center, and he was the uncle of Ramzi Yousef, the lead bomber in that attack. Mohammed and Yousef then moved on to the Bojinka plot, a new terrorist attack planned for January 1995. Despite its failure, and Yousef's imprisonment by the U.S. afterwards, the plot would influence the later 9/11 attacks. The 2004 9/11 Commission Report determined that the animosity which Mohammed, the principal architect of 9/11, felt towards the U.S. had stemmed from his "violent disagreement with U.S. foreign policy favoring Israel". Documents seized during the 2011 operation that killed bin Laden included notes handwritten by bin Laden in September 2002 with the heading "The Birth of the Idea of September 11". He describes how he was inspired by the crash of EgyptAir Flight 990 in October 1999, which was deliberately crashed by co-pilot Gameel Al-Batouti, killing over 200 passengers. 
"This is how the idea of 9/11 was conceived and developed in my head, and that is when we began the planning," bin Laden continued, adding that no one but Mohammed Atef and Abu al-Khair knew about it at the time. The 9/11 Commission Report identified Khalid Sheikh Mohammed as the architect of 9/11, but he is not mentioned in bin Laden's notes. During the trial of alleged 9/11 conspirator Zacarias Moussaoui, the U.S. government identified five people as having been completely aware of the operation's details: Osama bin Laden, Khalid Sheikh Mohammed, Mohammed Atef, Abu Turab al-Urduni, and Ramzi bin al-Shibh. The attacks were conceived by Khalid Sheikh Mohammed, who first presented the idea to bin Laden in 1996. Many targets that al-Qaeda hijackers could crash planes into were listed, including the Library Tower (now the U.S. Bank Tower) in Los Angeles. Bin Laden rejected the plan for being too elaborate. Al-Qaeda's first attacks against the U.S. after Bin Laden's 1998 fatwā were the 1998 African embassy bombings. In late 1998 or early 1999, bin Laden approved Mohammed to go forward with a new version of the 1996 plan. Bin Laden provided leadership and financial support, and was involved in selecting participants. Atef provided operational support, including target selections and helping arrange travel for the hijackers. Bin Laden initially selected Nawaf al-Hazmi and Khalid al-Mihdhar, both experienced jihadists who had fought in the Bosnian war. The two arrived in the United States in mid-January 2000. In early 2000, they took flying lessons in San Diego, California. Both spoke little English. They performed poorly in flying lessons, and so they eventually served as secondary "muscle" hijackers. The Hamburg cell in Germany included Islamists who came to be key operatives in the 9/11 attacks. In late 1999, cell members bin al-Shibh, Mohamed Atta, Marwan al-Shehhi, and Ziad Jarrah arrived to meet al-Qaeda in Afghanistan. 
Bin Laden selected these men because they were educated, could speak English, and had experience living in the West. New recruits were routinely screened for special skills and al-Qaeda leaders consequently discovered that Hani Hanjour already had a commercial pilot's license. Hanjour arrived in San Diego on December 8, 2000, joining al-Hazmi. They soon left for Arizona, where Hanjour took refresher training. Marwan al-Shehhi arrived at the end of May 2000, while Atta arrived on June 3, 2000, and Jarrah arrived on June 27, 2000. Bin al-Shibh applied several times for a visa to the United States, but as a Yemeni, he was rejected out of concerns he would overstay his visa. Bin al-Shibh stayed in Hamburg, providing coordination between Atta and Mohammed. The three Hamburg cell members all took pilot training in South Florida at Huffman Aviation. In the spring of 2001, the secondary hijackers began arriving in the United States. In July 2001, Atta met with bin al-Shibh in Spain, where they coordinated details of the plot, including final target selection. Bin al-Shibh passed along bin Laden's wish for the attacks to be carried out as soon as possible. Some of the hijackers received passports from corrupt Saudi officials who were family members or used fraudulent passports to gain entry. In late 1999, al-Qaeda associate Walid bin Attash ("Khallad") contacted al-Mihdhar and told him to meet in Kuala Lumpur, Malaysia; al-Hazmi and Abu Bara al Yemeni would also be in attendance. The NSA intercepted a telephone call mentioning the meeting, al-Mihdhar, and the name "Nawaf" (al-Hazmi); while the agency feared "Something nefarious might be afoot", it took no further action. The CIA had already been alerted by Saudi intelligence about al-Mihdhar and al-Hazmi being al-Qaeda members. A CIA team broke into al-Mihdhar's Dubai hotel room and discovered that Mihdhar had a U.S. visa. 
While Alec Station alerted intelligence agencies worldwide, it did not share this information with the FBI. The Malaysian Special Branch observed the January 5, 2000, meeting of the two al-Qaeda members and informed the CIA that al-Mihdhar, al-Hazmi, and Khallad were flying to Bangkok, but the CIA never notified other agencies of this, nor did it ask the State Department to put al-Mihdhar on its watchlist. An FBI liaison asked permission to inform the FBI of the meeting but was told: "This is not a matter for the FBI". By late June, senior counter-terrorism official Richard Clarke and CIA director George Tenet were "convinced that a major series of attacks was about to come", although the CIA believed the attacks would likely occur in Saudi Arabia or Israel. In early July, Clarke put domestic agencies on "full alert", telling them, "Something spectacular is going to happen here, and it's going to happen soon". He asked the FBI and the State Department to alert the embassies and police departments, and the Defense Department to go to "Threat Condition Delta". Clarke later wrote: Somewhere in CIA there was information that two known al Qaeda terrorists had come into the United States. Somewhere in the FBI, there was information that strange things had been going on at flight schools in the United States. [...] They had specific information about individual terrorists from which one could have deduced what was about to happen. None of that information got to me or the White House. [...] By July, with word spreading of a coming attack, a schism emerged among the senior leadership of al Qaeda. Several senior members reportedly agreed with Mullah Omar. Those who reportedly sided with bin Ladin included Atef, Sulayman Abu Ghayth, and Khalid Sheikh Mohammed. But those said to have opposed him were weighty figures in the organization, including Abu Hafs the Mauritanian, Sheikh Saeed al Masri, and Sayf al Adl. 
One senior al Qaeda operative claims to recall Bin Ladin arguing that attacks against the United States needed to be carried out immediately to support insurgency in the Israeli-occupied territories and protest the presence of U.S. forces in Saudi Arabia. On July 13, Tom Wilshire, a CIA agent assigned to the FBI's international terrorism division, emailed his superiors at the CIA's Counterterrorism Center (CTC) requesting permission to inform the FBI that Hazmi was in the country and that Mihdhar had a U.S. visa. The CIA never responded. The same day, Margarette Gillespie, an FBI analyst working in the CTC, was told to review material about the Malaysia meeting. She was not told of the participants' presence in the U.S. The CIA gave Gillespie surveillance photos of Mihdhar and Hazmi from the meeting to show to FBI counterterrorism but did not tell her their significance. A notice in the Intelink database instructed her not to share intelligence material with criminal investigators. When shown the photos, FBI agents were refused more details on their significance, and they were not given Mihdhar's date of birth or passport number. In late August 2001, Gillespie told the INS, the State Department, the Customs Service, and the FBI to put Hazmi and Mihdhar on their watchlists, but the FBI was prohibited from using criminal agents in searching for the duo, hindering their efforts. Also in July, a Phoenix-based FBI agent sent a message to FBI headquarters, Alec Station, and FBI agents in New York alerting them to "the possibility of a coordinated effort by Osama bin Laden to send students to the United States to attend civil aviation universities and colleges". The agent, Kenneth Williams, suggested the need to interview flight school managers and identify all Arab students seeking flight training. In July, Jordan alerted the U.S. that al-Qaeda was planning an attack on the U.S.; "months later", Jordan notified the U.S. 
that the attack's codename was "The Big Wedding" and that it involved airplanes. On August 6, 2001, the CIA's Presidential Daily Brief, designated "For the President Only", was entitled Bin Ladin Determined To Strike in US. The memo noted that FBI information "indicates patterns of suspicious activity in this country consistent with preparations for hijackings or other types of attacks". In mid-August, one Minnesota flight school alerted the FBI about Zacarias Moussaoui, who had asked "suspicious questions". The FBI found that Moussaoui was a radical who had traveled to Pakistan, and the INS arrested him for overstaying his French visa. Their request to search his laptop was denied by FBI headquarters due to the lack of probable cause. The failures in intelligence-sharing were attributed to 1995 Justice Department policies limiting intelligence-sharing, combined with CIA and NSA reluctance to reveal "sensitive sources and methods" such as tapped phones. Testifying before the 9/11 Commission in April 2004, then-Attorney General John Ashcroft recalled that the "single greatest structural cause for the September 11th problem was the wall that segregated or separated criminal investigators and intelligence agents". Clarke also wrote: "[T]here were ... failures to get information to the right place at the right time". Attacks Early on the morning of Tuesday, September 11, 2001, nineteen hijackers took control of four commercial airliners (two Boeing 757s and two Boeing 767s). Large planes with long flights were selected for hijacking because they would have more fuel. At 7:59 a.m., American Airlines Flight 11 took off from Logan International Airport in Boston. Fifteen minutes into the flight, five hijackers armed with boxcutters took over the plane, injuring at least three people (and possibly killing one) before forcing their way into the cockpit. 
The terrorists also displayed an apparent explosive and sprayed mace into the cabin, to frighten the hostages into submission and further hinder resistance. Back at Logan, United Airlines Flight 175 took off at 8:14 a.m. Hundreds of miles southwest at Dulles International Airport, American Airlines Flight 77 left the runway at 8:20 a.m. Flight 175's journey proceeded normally for 28 minutes until 8:42 a.m., when a group of five hijacked the plane, murdering both pilots and stabbing several crew members before assuming control of the aircraft. These hijackers also used bomb threats to instill fear into the passengers and crew, spraying "tear gas, pepper spray or another irritant" in the cabin to force passengers and flight attendants to the rear of the cabin. Concurrently, United Airlines Flight 93 departed from Newark International Airport in New Jersey; originally scheduled to pull away from the gate at 8:00 a.m., the plane was running 42 minutes late. At 8:46 a.m., Flight 11 was deliberately crashed into the north face of the World Trade Center's North Tower between the 93rd and 99th floors. The initial presumption by many was that it was an accident. At 8:51 a.m., American Airlines Flight 77 was also taken over by five hijackers who forcibly entered the cockpit 31 minutes after take-off. Although they were equipped with knives, there were no reports of anyone on board being stabbed, nor did the two people who made phone calls mention the use of mace or a bomb threat. Flight 175 was flown into the South Tower's southern facade (2 WTC) between the 77th and 85th floors at 9:03 a.m.,[h] demonstrating that the first crash was a deliberate act of terrorism. Four men aboard Flight 93 struck suddenly, killing at least one passenger, after having waited 46 minutes—a holdup that proved disastrous for the terrorists when combined with the delayed takeoff. 
They stormed the cockpit and seized control of the plane at 9:28 a.m., turning the plane eastbound towards Washington, D.C. Much like their counterparts on the first two flights, the fourth team used bomb threats and filled the cabin with mace. Nine minutes after Flight 93 was hijacked, Flight 77 crashed into the west side of the Pentagon at 9:37 a.m. Because of the two delays, the passengers and crew of Flight 93 had time to learn of the previous attacks through phone calls to the ground, and, as a result, an uprising was hastily organized to take control of the aircraft at 9:57 a.m. Within minutes, passengers had fought their way to the front of the cabin and began breaking down the cockpit door. Fearing their captives would gain the upper hand, the hijackers rolled the plane and pitched it into a nosedive, crashing into a field near Shanksville, Pennsylvania, southeast of Pittsburgh, at 10:03:11 a.m. The plane was about twenty minutes away from reaching D.C. at the time of the crash, and its target is believed to have been either the Capitol Building or the White House. Some passengers and crew who called from the aircraft using the cabin air phone service and mobile phones provided details: several hijackers were aboard each plane; they used mace, tear gas, or pepper spray to overcome attendants; and some people aboard had been stabbed. Reports indicated hijackers stabbed and killed pilots, flight attendants, and one or more passengers. According to the 9/11 Commission's final report, the hijackers had recently purchased multi-function hand tools and assorted Leatherman-type utility knives with locking blades (which were not forbidden to passengers at the time), but these were not found among the possessions left behind by the hijackers. A flight attendant on Flight 11, a passenger on Flight 175, and passengers on Flight 93 said the hijackers had bombs, but one of the passengers said he thought the bombs were fake. 
The FBI found no traces of explosives at the crash sites, and the 9/11 Commission concluded that the bombs were probably fake. On at least two of the hijacked flights—American 11 and United 93—the terrorists claimed over the PA system that they were taking hostages and were returning to the airport to have a ransom demand met, a clear attempt to prevent passengers from fighting back. Both attempts failed, however, as both hijacker pilots in these instances (Mohamed Atta and Ziad Jarrah, respectively) mistakenly transmitted their messages to ATC instead of the people on the plane as intended, tipping off the flight controllers that the planes had been hijacked. Three buildings in the World Trade Center collapsed due to fire-induced structural failure. Although the South Tower was struck around seventeen minutes after the North Tower, the plane hit far lower, at a much faster speed, and into a corner of the building; the unevenly balanced additional structural load caused the South Tower to collapse first, at 9:59 a.m., having burned for exactly 56 minutes[o] in the fire caused by the crash of United Airlines Flight 175 and the explosion of its fuel. The North Tower lasted another 29 minutes and 24 seconds before collapsing at 10:28 a.m.,[p] one hour, forty-one minutes, and fifty-three seconds[i] after being struck by American Airlines Flight 11. When the North Tower collapsed, debris fell on the nearby 7 World Trade Center building (7 WTC), damaging the building and starting fires. These fires burned for nearly seven hours, compromising the building's structural integrity, and 7 WTC collapsed at 5:21 p.m. The west side of the Pentagon sustained significant damage. At 9:42 a.m., the Federal Aviation Administration (FAA) grounded all civilian aircraft within the continental U.S., and civilian aircraft already in flight were told to land immediately. 
All international civilian aircraft were either turned back or redirected to airports in Canada or Mexico, and were banned from landing on United States territory for three days. The attacks created widespread confusion among news organizations and air traffic controllers. Among unconfirmed and often contradictory news reports aired throughout the day, one of the most prevalent claimed a car bomb had been detonated at the U.S. State Department's headquarters in Washington, D.C. Another jet (Delta Air Lines Flight 1989) was suspected of having been hijacked, but the aircraft responded to controllers and landed safely in Cleveland, Ohio. In an April 2002 interview, Khalid Sheikh Mohammed and Ramzi bin al-Shibh, who are believed to have organized the attacks, said Flight 93's intended target was the United States Capitol, not the White House. During the planning stage of the attacks, Mohamed Atta (Flight 11's hijacker and pilot) thought the White House might be too tough a target and sought an assessment from Hani Hanjour (who hijacked and piloted Flight 77). Mohammed said al-Qaeda initially planned to target nuclear installations rather than the World Trade Center and the Pentagon, but decided against it, fearing things could "get out of control". Final decisions on targets, according to Mohammed, were left in the hands of the pilots. If any pilot could not reach his intended target, he was to crash the plane. The attack on the World Trade Center's North Tower alone[q] made 9/11 the deadliest act of terrorism in history. Taken together, the four crashes killed 2,996 people (including the hijackers) and injured thousands more. The death toll included 265 on the four planes (from which there were no survivors); 2,606 in the World Trade Center and the surrounding area; and 125 at the Pentagon. Most who died were civilians, as well as 343 firefighters, 72 law enforcement officers, 55 military personnel, and the 19 terrorists. 
More than 90 countries lost citizens in the attacks. In New York City, more than 90% of those who died in the towers had been at or above the points of impact. In the North Tower, between 1,344 and 1,402 people were at, above, or one floor below the point of impact, and all died. Hundreds were killed instantly when the plane struck. The estimated 800 people who survived the impact were trapped and died in the fires or from smoke inhalation, fell or jumped from the tower to escape the smoke and flames, or were killed in the building's collapse. The destruction of all three staircases in the North Tower when Flight 11 hit made it impossible for anyone from the impact zone upward to escape; another 107 people who were not trapped by the impact also died. When Flight 11 struck between floors 93 and 99, the 92nd floor was rendered inescapable: the crash severed all elevator shafts while falling debris blocked the stairwells, ensuring the deaths of all 69 workers on the floor. In the South Tower, around 600 people were on or above the 77th floor when Flight 175 struck; few survived. As with the North Tower, hundreds were killed at the moment of impact. Unlike those in the North Tower, the estimated 300 survivors of the crash were not technically trapped, but most were either unaware that a means of escape still existed or were unable to use it. One stairway, Stairwell A, narrowly avoided being destroyed, allowing 14 people located on the floors of impact (including Stanley Praimnath, a man who saw the plane coming at him) and four more from the floors above to escape. New York City 9-1-1 operators who received calls from people inside the tower were not well informed of the situation as it rapidly unfolded and, as a result, told callers not to descend the tower on their own. In total, 630 people died in the South Tower, fewer than half the number killed in the North Tower.
Of the 100–200 people witnessed jumping or falling to their deaths, only three recorded sightings were from the South Tower. Casualties in the South Tower were significantly reduced because some occupants decided to leave the building immediately following the first crash, and because Eric Eisenberg, an executive at AON Insurance, decided to evacuate the floors occupied by AON (92 and 98–105) following the impact of Flight 11. The 17-minute gap allowed over 900 of the 1,100 AON employees present to evacuate from above the 77th floor before the South Tower was struck; Eisenberg was among the nearly 200 who did not escape. Similar pre-impact evacuations were carried out by Fiduciary Trust, CSC, and Euro Brokers, all of which had offices on floors above the point of impact. The failure to order a full evacuation of the South Tower after the first plane crashed into the North Tower was described by USA Today as "one of the day's great tragedies". As exemplified in the photograph The Falling Man, more than 200 people fell to their deaths from the burning towers, most of whom were forced to jump to escape the extreme heat, fire, and smoke. Some occupants of each tower above the point of impact made their way toward the roof in the hope of helicopter rescue, but the roof access doors were locked. No plan existed for helicopter rescues, and the combination of roof equipment, thick smoke, and intense heat prevented helicopters from approaching. At the World Trade Center complex, 414 emergency workers died as they tried to rescue people and fight fires, while another law enforcement officer was killed when United 93 crashed. 343 firefighters of the New York City Fire Department (FDNY) died, including a chaplain and two paramedics. 23 officers of the New York City Police Department (NYPD) died. 37 officers of the Port Authority Police Department (PAPD) died. Eight emergency medical technicians and paramedics from private emergency medical services units were killed.
Almost all of the emergency personnel who died at the scene were killed as a result of the towers collapsing, with the exception of one who was struck by a civilian falling from the South Tower. 658 employees of Cantor Fitzgerald L.P., an investment bank on the North Tower's 101st–105th floors, died, considerably more than any other employer. 358 employees of Marsh Inc., located immediately below Cantor Fitzgerald on floors 93–100, died, and 176 employees of Aon Corporation died. The National Institute of Standards and Technology (NIST) estimated that about 17,400 civilians were in the World Trade Center complex at the time of the attacks. Turnstile counts from the Port Authority suggest 14,154 people were typically in the Twin Towers by 8:45 a.m. Most people below the impact zone safely evacuated. In Arlington County, Virginia, 125 Pentagon workers died when Flight 77 crashed into the building's western side: 70 civilians and 55 military personnel, many of whom worked for the United States Army or the United States Navy. 47 civilian employees, six civilian contractors, and 22 soldiers working for the Army died, while six civilian employees, three civilian contractors, and 33 sailors working for the Navy died. Seven Defense Intelligence Agency (DIA) civilian employees and one Office of the Secretary of Defense contractor died. Timothy Maude, a lieutenant general and Army Deputy Chief of Staff, was the highest-ranking military official killed at the Pentagon. Weeks after the attack, the death toll was estimated to be over 6,000, more than twice the number of deaths eventually confirmed. The city was only able to identify remains for about 1,600 of the World Trade Center victims. The medical examiner's office collected "about 10,000 unidentified bone and tissue fragments that cannot be matched to the list of the dead".
Bone fragments were still being found in 2006 by workers who were preparing to demolish the damaged Deutsche Bank Building. In 2010, a team of anthropologists and archaeologists searched for human remains and personal items at the Fresh Kills Landfill, where 72 more human remains were recovered, bringing the total found to 1,845. As of 2011, DNA profiling was ongoing in an attempt to identify additional victims. In 2014, three coffin-size cases carrying 7,930 unidentified remains were transferred to a medical examiner's repository located at the same site as the National September 11 Memorial & Museum. Victims' families are permitted to visit a private "reflection room" which is closed to the public. The choice to place the remains in an underground area attached to a museum has been controversial; families of some victims have attempted to have the remains instead interred in a separate, above-ground monument. In August 2017, the 1,641st victim was identified as a result of newly available DNA technology, and a 1,642nd in July 2018. Three more victims were identified in October 2019, two in September 2021, and an additional two in September 2023. As of 2025, 1,103 victims remain unidentified, amounting to 40% of the deaths in the World Trade Center attacks. On September 25, 2023, the FDNY reported that the department had by then lost the same number of members to 9/11-related illnesses as it did on the day of the attacks. The Twin Towers, Marriott World Trade Center (3 WTC), 7 WTC, and St. Nicholas Greek Orthodox Church were destroyed. The U.S. Customs House (6 World Trade Center), 4 World Trade Center, 5 World Trade Center, and both pedestrian bridges connecting buildings were severely damaged. All surrounding streets were in ruins. The last fires at the World Trade Center site were extinguished on December 20. The Deutsche Bank Building was damaged and was later condemned as uninhabitable because of toxic conditions; it was deconstructed starting in 2007.
Buildings of the World Financial Center were damaged. The Borough of Manhattan Community College's Fiterman Hall was condemned due to extensive damage and reopened in 2012. Other neighboring buildings (including 90 West Street and the Verizon Building) suffered major damage but have been restored. World Financial Center buildings, One Liberty Plaza, the Millennium Hilton, and 90 Church Street had moderate damage and have been restored. Communications equipment on top of the North Tower was also destroyed; only WCBS-TV maintained a backup transmitter on the Empire State Building, but media stations were quickly able to reroute their signals and resume their broadcasts. The PATH train system's World Trade Center station was located under the complex and was demolished when the towers collapsed. The tunnels leading to Exchange Place station in Jersey City were flooded with water. The station was rebuilt as the $4 billion World Trade Center Transportation Hub, which reopened in March 2015. The Cortlandt Street station on the New York City Subway's IRT Broadway–Seventh Avenue Line was also in close proximity to the World Trade Center complex, and the entire station, along with the surrounding track, was reduced to rubble. The station was rebuilt and reopened to the public on September 8, 2018. The Pentagon was extensively damaged, causing one section of the building's E ring to collapse. As Flight 77 approached the Pentagon, its wings knocked down light poles and its right engine hit a power generator before the plane crashed into the western side of the building at the first-floor level. The front part of the fuselage disintegrated on impact; debris from the tail section penetrated the furthest into the building, breaking through 310 feet (94 m) of the three outermost of the building's five rings. The New York City Fire Department (FDNY) deployed more than 200 units (approximately half of the department) to the World Trade Center.
Their efforts were supplemented by off-duty firefighters, Hatzolah, and emergency medical technicians. The New York City Police Department (NYPD) sent its Emergency Service Units and other police personnel and deployed its aviation unit, which determined that helicopter rescues from the towers were not feasible. Numerous police officers of the Port Authority Police Department (PAPD) also participated in rescue efforts. Once on the scene, the FDNY, the NYPD, and the PAPD did not coordinate efforts and performed redundant searches for civilians. As conditions deteriorated, the NYPD aviation unit relayed information to police commanders, who issued orders for personnel to evacuate the towers; most NYPD officers were able to evacuate before the buildings collapsed. With separate command posts set up and incompatible radio communications between the agencies, warnings were not passed along to FDNY commanders. After the first tower collapsed, FDNY commanders issued evacuation warnings. Due to malfunctioning radio repeater systems, many firefighters never heard the evacuation orders. 9-1-1 dispatchers also received information from callers that was not passed along to commanders on the scene.
Reactions
The 9/11 attacks resulted in immediate responses, including domestic reactions; closings and cancellations; hate crimes; international responses; and military responses. Shortly after the attacks, the September 11th Victim Compensation Fund was created by an Act of Congress. The purpose of the fund was to compensate the victims of the attacks and their families in exchange for their agreement not to file lawsuits against the airlines involved. Legislation authorizes the fund to disburse a maximum of $7.375 billion of U.S. government funds, including operational and administrative costs. The fund was set to expire by 2020 but was extended in 2019 to allow claims to be filed until October 2090.
At 8:32 a.m., FAA officials were notified Flight 11 had been hijacked and they, in turn, notified the North American Aerospace Defense Command (NORAD). NORAD scrambled two F-15s from Otis Air National Guard Base in Massachusetts; they were airborne by 8:53 a.m. Because of slow and confused communication from FAA officials, NORAD had nine minutes' notice, and no notice about any of the other flights before they crashed. After both of the Twin Towers had been hit, more fighters were scrambled from Langley Air Force Base in Virginia at 9:30 a.m. At 10:20 a.m., Vice President Dick Cheney issued orders to shoot down any commercial aircraft that could be positively identified as being hijacked. These instructions were not relayed in time for the fighters to take action. Some fighters took to the air without live ammunition, knowing that to prevent the hijackers from striking their intended targets, the pilots might have to intercept and crash their fighters into the hijacked planes, possibly ejecting at the last moment. For the first time in U.S. history, the emergency preparedness plan Security Control of Air Traffic and Air Navigation Aids (SCATANA) was invoked, stranding tens of thousands of passengers across the world. Ben Sliney, on his first day as the National Operations Manager of the FAA, ordered that American airspace be closed to all international flights, causing about 500 flights to be turned back or redirected to other countries. Canada received 226 of the diverted flights and launched Operation Yellow Ribbon to deal with the large numbers of grounded planes and stranded passengers. The 9/11 attacks had immediate effects on the American people. Police and rescue workers from around the country traveled to New York City to help recover bodies from the remnants of the Twin Towers. Over 3,000 children lost a parent in the attacks. Blood donations across the U.S. surged in the weeks after 9/11. Following the attacks, Bush's approval rating increased to 90%.
On September 20, he addressed the nation and a joint session of Congress regarding the events, the rescue and recovery efforts, and his intended response to the attacks. New York City mayor Rudy Giuliani's highly visible role resulted in praise in New York and nationally. Many relief funds were immediately set up to provide financial assistance to the survivors of the attacks and the victims' families. By the deadline for victims' compensation on September 11, 2003, 2,833 applications had been received from the families of those killed. Contingency plans for the continuity of government and the evacuation of leaders were implemented soon after the attacks. Congress was not told that the United States had been under a continuity of government status until February 2002. In the largest restructuring of the U.S. government in contemporary history, the United States enacted the Homeland Security Act of 2002, creating the U.S. Department of Homeland Security. Congress also passed the USA PATRIOT Act, saying it would help detect and prosecute terrorism and other crimes. Civil liberties groups have criticized the PATRIOT Act, saying it allows law enforcement to invade citizens' privacy and that it eliminates judicial oversight of law enforcement and domestic intelligence. To effectively combat future acts of terrorism, the National Security Agency (NSA) was given broad powers. The NSA commenced warrantless surveillance of telecommunications, which was sometimes criticized as permitting the agency "to eavesdrop on telephone and e-mail communications between the United States and people overseas without a warrant". In response to requests by intelligence agencies, the United States Foreign Intelligence Surveillance Court permitted an expansion of powers by the U.S. government in seeking, obtaining, and sharing information on U.S. citizens as well as non-Americans around the world. 
Six days after the attacks, President Bush made a public appearance at Washington, D.C.'s largest Islamic Center, where he acknowledged the "incredibly valuable contribution" of American Muslims and called for them "to be treated with respect". Numerous incidents of harassment and hate crimes against Muslims and South Asians were reported in the days following the attacks. Sikhs were also targeted because they wear turbans, which are stereotypically associated with Muslims. There were reports of attacks on mosques and other religious buildings (including the firebombing of a Hindu temple), and assaults on individuals, including one murder: Balbir Singh Sodhi, a Sikh mistaken for a Muslim, was fatally shot on September 15, 2001, in Mesa, Arizona. Two dozen members of Osama bin Laden's family were urgently flown out of the United States on a private charter plane under FBI supervision three days after the attacks. According to an academic study, people perceived to be Middle Eastern were as likely to be victims of hate crimes as followers of Islam during this time; the study found a similar increase in hate crimes against people who may have been perceived as Muslims, Arabs, or otherwise of Middle Eastern origin. A report by the South Asian American advocacy group South Asian Americans Leading Together documented media coverage of 645 bias incidents against Americans of South Asian or Middle Eastern descent between September 11 and 17, 2001. Crimes such as vandalism, arson, assault, shootings, harassment, and threats in numerous places were documented. Women wearing the hijab were also targeted. A poll of Arab-Americans in May 2002 found that 20% had personally experienced discrimination since September 11. A July 2002 poll of Muslim Americans found that 48% believed their lives had changed for the worse since September 11, and 57% had experienced an act of bias or discrimination.
Following the September 11 attacks, many Pakistani Americans identified themselves as Indians to avoid potential discrimination and obtain jobs. By May 2002, there were 488 complaints of employment discrimination reported to the U.S. Equal Employment Opportunity Commission (EEOC); 301 of those were complaints from people fired from their jobs. Similarly, by June 2002, the U.S. Department of Transportation (DOT) had investigated 111 September 11th-related complaints from airline passengers alleging that their religious or ethnic appearance caused them to be singled out at security screenings, and an additional 31 complaints from people who alleged they were blocked from boarding airplanes on the same grounds. Muslim organizations in the United States were swift to condemn the attacks and called "upon Muslim Americans to come forward with their skills and resources to help alleviate the sufferings of the affected people and their families". These organizations included the Islamic Society of North America, American Muslim Alliance, American Muslim Council, Council on American-Islamic Relations, Islamic Circle of North America, and the Shari'a Scholars Association of North America. Along with monetary donations, many Islamic organizations launched blood drives and provided medical assistance, food, and shelter for victims. Curiosity about Islam increased after the attacks. As a result, many mosques and Islamic centers began holding open houses and participating in outreach efforts to educate non-Muslims about the faith. In the first 10 years after the attacks, interfaith community service increased from 8 to 20 percent and the percentage of U.S. congregations involved in interfaith worship doubled from 7 to 14 percent. The attacks were denounced by mass media and governments worldwide. Nations offered pro-American support and solidarity. Leaders in most Middle Eastern countries, as well as Libya and Afghanistan, condemned the attacks.
Iraq was a notable exception, with an immediate official statement that "the American cowboys are reaping the fruit of their crimes against humanity". The government of Saudi Arabia officially condemned the attacks, but privately many Saudis favored bin Laden's cause. Although Palestinian Authority (PA) president Yasser Arafat also condemned the attacks, there were reports of celebrations of disputed size in the West Bank, Gaza Strip, and East Jerusalem. Palestinian leaders discredited news broadcasters that justified the attacks or showed celebrations, and the Authority claimed such celebrations did not represent the Palestinians' sentiment. A report originating at a Brazilian university suggested that footage aired by CNN and other news outlets was from 1991; this was later proven to be a false accusation. As in the United States, the aftermath of the attacks saw tensions increase in other countries between Muslims and non-Muslims. United Nations Security Council Resolution 1368 condemned the attacks and expressed readiness to take all necessary steps to respond and combat terrorism in accordance with the UN Charter. Numerous countries introduced anti-terrorism legislation and froze bank accounts they suspected of al-Qaeda ties. Law enforcement and intelligence agencies in a number of countries arrested alleged terrorists. British Prime Minister Tony Blair said Britain stood "shoulder to shoulder" with the United States. In a speech to Congress nine days after the attacks, which Blair attended as a guest, President Bush declared "America has no truer friend than Great Britain". Subsequently, Prime Minister Blair embarked on two months of diplomacy to rally international support for military action; he held 54 meetings with world leaders. The U.S. set up the Guantanamo Bay detention camp to hold inmates it defined as "illegal enemy combatants". The legitimacy of these detentions has been questioned by the European Union and human rights organizations.
On September 25, 2001, Iran's president Mohammad Khatami, meeting British Foreign Secretary Jack Straw, said: "Iran fully understands the feelings of the Americans about the terrorist attacks in New York and Washington on September 11". He said that although American administrations had been at best indifferent to terrorist operations in Iran, the Iranians felt differently and had expressed their sympathy for the Americans bereaved in the tragic incidents in the two cities. He also stated that "Nations should not be punished in place of terrorists". According to Radio Farda's website, when the news of the attacks was released, some Iranian citizens gathered in front of the Embassy of Switzerland in Tehran, which serves as the protecting power of the United States in Iran, to express their sympathy, and some of them lit candles as a symbol of mourning. Radio Farda's website also states that in 2011, on the anniversary of the attacks, the United States Department of State published a post on its blog, in which the Department thanked the Iranian people for their sympathy and stated that it would never forget their kindness. After the attacks, both the President and the Supreme Leader of Iran condemned the attacks. The BBC and Time magazine published reports on their websites about candlelit vigils held for the victims by Iranian citizens. According to Politico Magazine, following the attacks, Ali Khamenei, the Supreme Leader of Iran, temporarily "suspended the usual 'Death to America' chants at Friday prayers". At 2:40 p.m. on September 11, Secretary of Defense Donald Rumsfeld was issuing orders to his aides to look for evidence of Iraqi involvement. According to notes taken by senior policy official Stephen Cambone, Rumsfeld asked for, "Best info fast. Judge whether they are good enough to hit S.H. at the same time. Not only OBL".
In a meeting at Camp David on September 15, the Bush administration rejected the idea of attacking Iraq in response to the September 11 attacks. Nonetheless, the U.S. and its allies later invaded the country, citing "Saddam Hussein's support for terrorism". At the time, as many as seven in ten Americans believed the Iraqi president played a role in the 9/11 attacks. Three years later, Bush conceded that Hussein had not. The NATO council declared that the terrorist attacks on the United States were an attack on all NATO nations that satisfied Article 5 of the NATO charter. This marked the first invocation of Article 5, which had been written during the Cold War with an attack by the Soviet Union in mind. Australian Prime Minister John Howard, who was in Washington, D.C., during the attacks, invoked Article IV of the ANZUS treaty. The Bush administration announced a war on terror, with the stated goals of bringing bin Laden and al-Qaeda to justice and preventing the emergence of other terrorist networks. These goals would be accomplished by imposing economic and military sanctions against states harboring terrorists, and increasing global surveillance and intelligence sharing. On September 14, 2001, the U.S. Congress passed the Authorization for Use of Military Force Against Terrorists, which grants the President the authority to use all "necessary and appropriate force" against those whom he determined "planned, authorized, committed or aided" the September 11 attacks or who harbored said persons or groups. It is still in effect. On October 7, 2001, the war in Afghanistan began when U.S. and British forces initiated aerial bombing campaigns targeting Taliban and al-Qaeda camps, then invaded Afghanistan with Special Forces ground troops. This eventually led to the overthrow of the Taliban's rule of Afghanistan with the Fall of Kandahar on December 7 by U.S.-led coalition forces.
Al-Qaeda leader Osama bin Laden, who went into hiding in the White Mountains, was targeted by U.S. coalition forces in the Battle of Tora Bora, but he escaped across the Pakistani border and remained out of sight for almost ten years. In an interview with Tayseer Allouni on October 21, 2001, bin Laden stated: The events proved the extent of terrorism that America exercises in the world. Bush stated that the world has to be divided in two: Bush and his supporters, and any country that doesn't get into the global crusade is with the terrorists. What terrorism is clearer than this? Many governments were forced to support this "new terrorism"... America wouldn't live in security until we live it truly in Palestine. This showed the reality of America, which puts Israel's interest above its own people's interest. America won't get out of this crisis until it gets out of the Arabian Peninsula, and until it stops its support of Israel.
Aftermath
Hundreds of thousands of tons of toxic debris containing more than 2,500 contaminants and known carcinogens were spread across Lower Manhattan when the towers collapsed. Exposure to the toxins in the debris is alleged to have contributed to fatal or debilitating illnesses among people who were at Ground Zero. The Bush administration ordered the Environmental Protection Agency (EPA) to issue reassuring statements regarding air quality in the aftermath of the attacks, citing national security, but the EPA did not determine that air quality had returned to pre–September 11 levels until June 2002. Health effects extended to residents, students, and office workers of Lower Manhattan and nearby Chinatown. Several deaths have been linked to the toxic dust, and the victims' names were included in the World Trade Center memorial. An estimated 18,000 people have developed illnesses as a result of the toxic dust. There is also scientific speculation that exposure to toxic products in the air may have negative effects on fetal development.
A study of rescue workers released in April 2010 found that all those studied had impaired lung function. Years after the attacks, legal disputes over the costs of related illnesses were still in the court system. In 2006, a federal judge rejected New York City's refusal to pay for health costs for rescue workers, allowing for the possibility of suits against the city. Government officials have been faulted for urging the public to return to Lower Manhattan in the weeks shortly after the attacks. Christine Todd Whitman, administrator of the EPA in the attacks' aftermath, was heavily criticized by a U.S. District Judge for incorrectly saying that the area was environmentally safe. Mayor Giuliani was criticized for urging financial industry personnel to return quickly to the greater Wall Street area. The James L. Zadroga 9/11 Health and Compensation Act (2010) allocated $4.2 billion to create the World Trade Center Health Program, which provides testing and treatment for people with long-term health problems related to the 9/11 attacks. The WTC Health Program replaced preexisting 9/11-related health programs such as the Medical Monitoring and Treatment Program and the WTC Environmental Health Center program. In 2020, the NYPD confirmed that 247 of its police officers had died due to 9/11-related illnesses. In September 2022, the FDNY confirmed that 299 firefighters had died due to 9/11-related illnesses. Both agencies believe that the death toll will rise dramatically in the coming years. The Port Authority of New York and New Jersey Police Department (PAPD), the law enforcement agency with jurisdiction over the World Trade Center, confirmed that four of its police officers had died of 9/11-related illnesses.
The chief of the PAPD at the time, Joseph Morris, made sure that industrial-grade respirators were provided to all PAPD police officers within 48 hours and decided that the same 30 to 40 police officers would be stationed at the World Trade Center pile, drastically lowering the number of total PAPD personnel who would be exposed to the air. The FDNY and NYPD, by contrast, rotated hundreds, if not thousands, of different personnel from all over New York City through the pile without the adequate respirators and breathing equipment that could have prevented future illnesses. The attacks had a significant economic impact on the U.S. and world markets. The stock exchanges did not open on September 11 and remained closed until September 17. Upon reopening, the Dow Jones Industrial Average (DJIA) fell 684 points, or 7.1%, to 8921, a record-setting one-day point decline. By the end of the week, the DJIA had fallen 1,369.7 points (14.3%), at the time its largest one-week point drop in history. In 2001 dollars, U.S. stocks lost US$1.4 trillion in valuation for the week. In New York City, about 430,000 job months and US$2.8 billion in wages were lost in the first three months after the attacks. The economic effects fell mainly on the economy's export sectors. The city's GDP was estimated to have declined by US$27.3 billion for the last three months of 2001 and all of 2002. The U.S. government provided US$11.2 billion in immediate assistance to the Government of New York City in September 2001, and US$10.5 billion in early 2002 for economic development and infrastructure needs. Also hurt were small businesses in Lower Manhattan near the World Trade Center (18,000 of which were destroyed or displaced), resulting in lost jobs and wages. Assistance was provided by Small Business Administration loans, federal government Community Development Block Grants, and Economic Injury Disaster Loans. Some 31,900,000 square feet (2,960,000 m2) of Lower Manhattan office space was damaged or destroyed.
Many wondered whether these jobs would return, and whether the damaged tax base would recover. Studies of 9/11's economic effects show the Manhattan office real-estate market and office employment were less affected than first feared, because of the financial services industry's need for face-to-face interaction. North American airspace was closed for several days after the attacks, and air travel decreased upon its reopening, leading to a nearly 20% cutback in air travel capacity and exacerbating financial problems in the struggling U.S. airline industry. The September 11 attacks also led to the U.S. wars in Afghanistan and Iraq, as well as additional homeland security spending, totaling at least US$5 trillion. As one contemporaneous account described Afghanistan: "If Americans are clamouring to bomb Afghanistan back to the Stone Age, they ought to know that this nation does not have so far to go. This is a post-apocalyptic place of felled cities, parched land and downtrodden people." Most of the Afghan population was already going hungry at the time of the attacks. In the aftermath of the attacks, tens of thousands of people attempted to flee Afghanistan due to the possibility of military retaliation by the U.S. Pakistan, already home to many Afghan refugees from previous conflicts, closed its border with Afghanistan on September 17, 2001. Thousands of Afghans also fled to the frontier with Tajikistan but were denied entry. The Taliban leaders in Afghanistan pleaded against military action, saying "We appeal to the United States not to put Afghanistan into more misery because our people have suffered so much", referring to two decades of conflict and the humanitarian crisis attached to it. All United Nations expatriates had left Afghanistan after the attacks, and no national or international aid workers were at their posts. Workers were instead preparing in bordering countries like Pakistan, China, and Uzbekistan to prevent a potential "humanitarian catastrophe", amid a critically low food stock for the Afghan population.
The World Food Programme stopped importing wheat to Afghanistan on September 12 due to security risks. Approximately one month after the attacks, the United States led a broad coalition of international forces into Afghanistan to overthrow the Taliban regime for its harboring of al-Qaeda. Though Pakistani authorities were initially reluctant to align themselves with the U.S. against the Taliban, they permitted the coalition access to their military bases, and arrested and handed over to the U.S. more than 600 suspected al-Qaeda members. In 2011, the U.S. and NATO under President Obama initiated a drawdown of troops in Afghanistan that was finalized in 2016. During the presidencies of Donald Trump and Joe Biden in 2020 and 2021, the United States alongside its NATO allies withdrew all troops from Afghanistan, completing the withdrawal of all regular U.S. troops on August 30, 2021. The withdrawal marked the end of the 2001–2021 war in Afghanistan. Biden said that after nearly 20 years of war, it was clear that the U.S. military could not transform Afghanistan into a modern democracy. Immediate responses to 9/11 included greater focus on home life and time spent with family, higher church attendance, and increased expressions of patriotism such as the flying of American flags. The radio industry responded by removing certain songs from playlists, and the attacks have subsequently been used as background, narrative, or thematic elements in film, music, literature, and humor. Already-running television shows as well as programs developed after 9/11 have reflected post-9/11 cultural concerns. 9/11 conspiracy theories have become a social phenomenon, despite a lack of support from expert scientists, engineers, and historians.
9/11 also had a major impact on the religious faith of many individuals; for some it was strengthened, as they sought consolation to cope with the loss of loved ones and overcome their grief, while others began to question their faith or lost it entirely because they could not reconcile it with their view of religion. American culture after the attacks was marked by heightened security and an increased demand for it, as well as anxiety and paranoia regarding future terrorist attacks across much of the nation. Psychologists have also documented increased national anxiety about commercial air travel. Anti-Muslim hate crimes rose nearly ten-fold in 2001 and have subsequently remained "roughly five times higher than the pre-9/11 rate". The September 11 attacks introduced foreign terrorism as a major security issue to the U.S., as they indicated that smaller states and terrorist organizations had become increasingly capable of striking even major global powers. Many governments across the world passed legislation to combat terrorism as a result of the attacks. In Germany, where several of the 9/11 terrorists had resided and taken advantage of that country's liberal asylum policies, two major anti-terrorism packages were enacted. The first removed legal loopholes that permitted terrorists to live and raise money in Germany. The second addressed the effectiveness and communication of intelligence and law enforcement. Canada passed the Canadian Anti-Terrorism Act, its first anti-terrorism law. The United Kingdom passed the Anti-terrorism, Crime and Security Act 2001 and the Prevention of Terrorism Act 2005. New Zealand enacted the Terrorism Suppression Act 2002. In the United States, the Department of Homeland Security was created by the Homeland Security Act of 2002 to coordinate domestic anti-terrorism efforts.
The USA Patriot Act gave the federal government greater powers, including the authority to detain foreign terror suspects for a week without charge; to monitor terror suspects' telephone communications, e-mail, and Internet use; and to prosecute suspected terrorists without time restrictions. The FAA ordered that airplane cockpits be reinforced with a secondary flight deck to prevent terrorists from gaining control of planes and assigned sky marshals to flights. Further, the Aviation and Transportation Security Act made the federal government, rather than airports, responsible for airport security. The law created the Transportation Security Administration to inspect passengers and luggage, causing long delays and concern over passenger privacy. After suspected abuses of the USA Patriot Act were brought to light in June 2013 with articles about the collection of American call records by the NSA and the PRISM program, Representative Jim Sensenbrenner (of Wisconsin), who introduced the Patriot Act in 2001, said that the NSA overstepped its bounds. Criticism of the war on terror has focused on its morality, efficiency, and cost. According to a 2021 report by the Costs of War Project, the several post-9/11 wars participated in by the United States in its war on terror have caused the displacement, conservatively calculated, of 38 million people in Afghanistan, Pakistan, Iraq, Libya, Syria, Yemen, Somalia, and the Philippines. They estimated these wars caused the deaths of 897,000 to 929,000 people directly and cost US$8 trillion. In a 2023 report, the Costs of War Project estimated that there have been between 3.6 and 3.7 million indirect deaths in the post-9/11 war zones, with the total death toll being 4.5 to 4.6 million. 
The report defined post-9/11 war zones as conflicts that included significant United States counter-terrorism operations since 9/11, which, in addition to the wars in Iraq, Afghanistan and Pakistan, also include the civil wars in Syria, Yemen, Libya and Somalia. The report derived its estimate of indirect deaths using a calculation from the Geneva Declaration Secretariat, which estimates that for every person directly killed by war, four more die from the indirect consequences of war. The U.S. Constitution and U.S. law prohibit the use of torture, yet such human rights violations occurred during the war on terror under the euphemism "enhanced interrogation". In 2005, The Washington Post and Human Rights Watch (HRW) published revelations concerning CIA flights and "black sites", covert prisons operated by the CIA. The term "torture by proxy" is used by some critics to describe situations in which the CIA and other U.S. agencies have transferred suspected terrorists to countries known to employ torture. As all 19 hijackers died in the attacks, they were never prosecuted. Osama bin Laden was never formally indicted; he was ultimately killed by U.S. special forces on May 2, 2011, in his compound in Abbottabad, Pakistan, after a 10-year manhunt.[r] The main trial over the attacks, against Mohammed and his co-conspirators Walid bin Attash, Ramzi bin al-Shibh, Ammar al-Baluchi, and Mustafa Ahmad al-Hawsawi, remains unresolved. Khalid Sheikh Mohammed was arrested on March 1, 2003, in Rawalpindi, Pakistan, by Pakistani security officials working with the CIA. He was then held at multiple CIA secret prisons and the Guantanamo Bay detention camp, where he was interrogated and tortured with methods including waterboarding. In 2003, al-Hawsawi and Abd al-Aziz Ali were arrested and transferred to U.S. custody. Both would later be accused of providing money and travel assistance to the hijackers. During U.S.
hearings at Guantanamo Bay in March 2007, Mohammed again confessed his responsibility for the attacks, stating he "was responsible for the 9/11 operation from A to Z" and that his statement was not made under duress. In January 2023, the U.S. government opened discussions about a potential plea deal, with the Biden administration abandoning the effort in September of that year. To date, only peripheral figures have been convicted of charges in connection with the attacks. In July 2024, The New York Times reported that Mohammed, bin Attash, and al-Hawsawi had agreed to plead guilty to conspiracy in exchange for life sentences, avoiding trial and execution. However, U.S. Defense Secretary Lloyd Austin revoked the plea agreement days later. Investigations Immediately after the attacks, the Federal Bureau of Investigation (FBI) started PENTTBOM, the largest criminal inquiry in U.S. history. At its height, more than half of the FBI's agents worked on the investigation and followed a half-million leads. The FBI concluded that there was "clear and irrefutable" evidence linking al-Qaeda and bin Laden to the attacks. The FBI quickly identified the hijackers, including leader Mohamed Atta, when his luggage was discovered at Boston's Logan Airport. Atta had been forced to check two of his three bags due to space limitations on the 19-seat commuter flight he took to Boston. Due to a new policy instituted to prevent flight delays, the luggage failed to make it aboard American Airlines Flight 11 as planned. The luggage contained the hijackers' names, assignments, and al-Qaeda connections. "It had all these Arab-language [sic] papers that amounted to the Rosetta stone of the investigation," said one FBI agent. Within hours of the attacks, the FBI released the names and in many cases the personal details of the suspected pilots and hijackers.
Abu Jandal, who served as bin Laden's chief bodyguard for years, confirmed the identity of seven hijackers as al-Qaeda members during interrogations with the FBI on September 17. He had been jailed in a Yemeni prison since 2000. On September 27, photos of all 19 hijackers were released, along with information about possible nationalities and aliases. Fifteen of the men were from Saudi Arabia, two were from the United Arab Emirates, one was from Egypt, and one was from Lebanon. By midday, the U.S. National Security Agency and German intelligence agencies had intercepted communications pointing to Osama bin Laden. Two of the hijackers were known to have traveled with a bin Laden associate to Malaysia in 2000 and hijacker Mohamed Atta had previously gone to Afghanistan. He and others were part of a terrorist cell in Hamburg, Germany. One of the members of the Hamburg cell in Germany was discovered to have been in communication with Khalid Sheikh Mohammed who was identified as a member of al-Qaeda. Authorities in the United States and the United Kingdom also obtained electronic intercepts, including telephone conversations and electronic bank transfers, which indicated that Mohammed Atef was a key figure in the planning of the 9/11 attacks. Intercepts were also obtained of conversations that took place days before September 11 between bin Laden and an associate in Pakistan referring to "an incident that would take place in America on, or around, September 11" and discussing potential repercussions. In another conversation with an associate in Afghanistan, bin Laden discussed the "scale and effects of a forthcoming operation." These conversations did not specifically mention the World Trade Center, the Pentagon, or other specifics. In their annual violent crime index for the year 2001, the FBI recorded the deaths from the attacks as murder, in separate tables so as not to mix them with other reported crimes for that year. 
In a disclaimer, the FBI stated that "the number of deaths is so great that combining it with the traditional crime statistics will have an outlier effect that falsely skews all types of measurements in the program's analyses." New York City also did not include the deaths in their annual crime statistics for 2001. In 2004, John L. Helgerson, the Inspector General of the Central Intelligence Agency (CIA), conducted an internal review of the agency's pre-9/11 performance and was harshly critical of senior CIA officials for not doing everything possible to confront terrorism. According to Philip Giraldi in The American Conservative, Helgerson criticized their failure to stop two of the 9/11 hijackers, Nawaf al-Hazmi and Khalid al-Mihdhar, as they entered the United States and their failure to share information on the two men with the FBI. In May 2007, senators from both major U.S. political parties (the Republican and Democratic parties) drafted legislation to make the review public. One of the backers, Senator Ron Wyden said, "The American people have a right to know what the Central Intelligence Agency was doing in those critical months before 9/11". The report was released in 2009 by President Barack Obama. In February 2002, the Senate Select Committee on Intelligence and the House Permanent Select Committee on Intelligence formed a joint inquiry into the performance of the U.S. Intelligence Community. Their 832-page report released in December 2002 detailed failings of the FBI and CIA to use available information, including about terrorists the CIA knew were in the United States, to disrupt the plots. The joint inquiry developed its information about possible involvement of Saudi Arabian government officials from non-classified sources. The Bush administration demanded 28 related pages remain classified. 
In December 2002, the inquiry's chair, Bob Graham, revealed in an interview that there was "evidence that there were foreign governments involved in facilitating the activities of at least some of the terrorists in the United States". Victim families were frustrated by the unanswered questions and redacted material from the congressional inquiry and demanded an independent commission. September 11 victim families, members of Congress and the Saudi Arabian government are still seeking the release of the documents. In June 2016, CIA chief John Brennan said that he believed the 28 redacted pages of the congressional inquiry into 9/11 would soon be made public, and that they would prove that the government of Saudi Arabia had no involvement in the September 11 attacks. In September 2016, Congress passed the Justice Against Sponsors of Terrorism Act, which would allow relatives of victims of the September 11 attacks to sue Saudi Arabia for its government's alleged role in the attacks. The National Commission on Terrorist Attacks Upon the United States, popularly known as the 9/11 Commission, chaired by Thomas Kean,[s] was formed in late 2002 to prepare a thorough account of the circumstances surrounding the attacks, including preparedness for and the immediate response to the attacks. The commission issued the 9/11 Commission Report in July 2004, a 585-page report based on its investigations. The report detailed the events leading up to the attacks, concluding that they were carried out by al-Qaeda. The commission also examined how security and intelligence agencies were inadequately coordinated to prevent the attacks. According to the report, "We believe the 9/11 attacks revealed four kinds of failures: in imagination, policy, capabilities, and management." The commission made numerous recommendations on how to prevent future attacks, and in 2011 expressed dismay that several of its recommendations had yet to be implemented. The U.S.
National Institute of Standards and Technology investigated the collapses of the Twin Towers and 7 WTC. The investigations examined why the buildings collapsed and what fire protection measures were in place, and evaluated how fire protection systems might be improved in future construction. The investigation into the collapse of 1 WTC and 2 WTC was concluded in October 2005 and that of 7 WTC was completed in August 2008. NIST found that the fireproofing on the Twin Towers' steel infrastructures was blown off by the initial impact of the planes and that had this not occurred, the towers likely would have remained standing. A 2007 study of the north tower's collapse published by researchers at Purdue University determined that since the plane's impact had stripped off much of the structure's thermal insulation, the heat from a typical office fire would have softened and weakened the exposed girders and columns enough to initiate the collapse regardless of the number of columns cut or damaged by the impact. The director of the original investigation stated that "the towers did amazingly well. The terrorist aircraft didn't bring the buildings down; it was the fire that followed. It was proven that you could take out two-thirds of the columns in a tower and the building would still stand." The fires weakened the trusses supporting the floors, making the floors sag. The sagging floors pulled on the exterior steel columns, causing them to bow inward. With the damage to the core columns, the buckling exterior columns could no longer support the buildings, causing them to collapse. Additionally, the report found the towers' stairwells were not properly reinforced to provide adequate emergency escape for people above the impact zones. NIST concluded that uncontrolled fires in 7 WTC caused floor beams and girders to heat and subsequently "caused a critical support column to fail, initiating a fire-induced progressive collapse that brought the building down."
In July 2016, the Obama administration released a document compiled by U.S. investigators Dana Lesemann and Michael Jacobson, known as "File 17," which contains a list naming three dozen people, including suspected Saudi intelligence officers attached to Saudi Arabia's embassy in Washington, D.C., connecting Saudi Arabia to the hijackers. In September 2016, Congress passed the Justice Against Sponsors of Terrorism Act. The practical effect of the legislation was to allow the continuation of a longstanding civil lawsuit brought by families of victims of the September 11 attacks against Saudi Arabia for its government's alleged role in the attacks. In March 2018, a U.S. judge formally allowed a suit to move forward against the government of Saudi Arabia brought by 9/11 survivors and victims' families. In 2022, the families of some 9/11 victims obtained two videos and a notepad seized from Saudi national Omar al-Bayoumi by the British courts. The first video showed him hosting a party in San Diego for Nawaf al-Hazmi and Khalid al-Mihdhar, the first two hijackers to arrive in the U.S. The other video showed al-Bayoumi greeting the cleric Anwar al-Awlaki, who was blamed for radicalizing Americans and later killed in a CIA drone strike. The notepad depicted a hand-drawn airplane and some mathematical equations that, according to a pilot's court statement, might have been used to calculate the rate of descent to get to a target. According to a 2017 FBI memo, from the late 1990s until the 9/11 attack, al-Bayoumi was a paid cooptee of the Saudi General Intelligence Presidency. As of April 2022, he is believed to be living in Saudi Arabia, which has denied any involvement in 9/11. Rebuilding and memorials On the day of the attacks, New York City mayor Rudy Giuliani stated: "We will rebuild. We're going to come out of this stronger than before, politically stronger, economically stronger. The skyline will be made whole again".
Within hours of the attack, a substantial search and rescue operation was launched. After months of around-the-clock operations, the World Trade Center site was cleared by the end of May 2002. The damaged section of the Pentagon was rebuilt and occupied within a year of the attacks. The temporary World Trade Center PATH station opened in late 2003 and construction of the new 7 World Trade Center was completed in 2006. Work on rebuilding the main World Trade Center site was delayed until late 2006, when leaseholder Larry Silverstein and the Port Authority of New York and New Jersey agreed on financing. The construction of One World Trade Center began in April 2006, and the building reached its full height in May 2013, when the spire was installed atop it, putting One WTC's height at 1,776 feet (541 m) and thus claiming the title of the tallest building in the Western Hemisphere. One WTC finished construction and opened on November 3, 2014. On the World Trade Center site, three more office towers were to be built one block east of where the original towers stood. 4 WTC, meanwhile, opened in November 2013, making it the second tower on the site to open, after 7 World Trade Center, as well as the first building on the Port Authority property. 3 WTC opened in June 2018, becoming the fourth skyscraper at the site to be completed. In December 2022, the St. Nicholas Greek Orthodox Church fully reopened for regular services, followed by the opening of the Ronald O. Perelman Performing Arts Center in September 2023. 2 World Trade Center, whose construction began in 2008, remains unfinished as of 2025. Scale models of the building were publicly revealed in September 2024, although Silverstein Properties was still trying to secure funding for the tower at the time. In the days immediately following the attacks, many memorials and vigils were held around the world, and photographs of the dead and missing were posted around Ground Zero.
A witness described being unable to "get away from faces of innocent victims who were killed. Their pictures are everywhere, on phone booths, street lights, and walls of subway stations. Everything reminded me of a huge funeral, people were quiet and sad, but also very nice. Before, New York gave me a cold feeling; now people were reaching out to help each other". President Bush proclaimed Friday, September 14, 2001, a National Day of Prayer and Remembrance. One of the first memorials was the Tribute in Light, an installation of 88 searchlights at the footprints of the World Trade Center towers. In New York City, the World Trade Center Site Memorial Competition was held to design an appropriate memorial on the site. The winning design, Reflecting Absence, was selected in August 2006, and consists of a pair of reflecting pools in the footprints of the towers, surrounded by a list of the victims' names in an underground memorial space. The memorial was completed on the 10th anniversary of the attacks in 2011; a museum also opened on the site in May 2014. The Sphere, by the German sculptor Fritz Koenig, is the world's largest bronze sculpture of modern times and stood between the Twin Towers on the Austin J. Tobin Plaza from 1971 until the attacks. The sculpture, weighing more than 20 tons, was the only remaining work of art to be recovered largely intact from the ruins of the towers. Since then, the work, known in the U.S. as The Sphere, has become a symbolic monument of 9/11 commemoration. After being dismantled and stored near a hangar at John F. Kennedy International Airport, the sculpture was the subject of the 2001 documentary The Sphere by filmmaker Percy Adlon. In August 2017, the work was installed at Liberty Park, close to the new World Trade Center and the 9/11 Memorial. In Arlington County, the Pentagon Memorial was completed and opened to the public on the seventh anniversary of the attacks in 2008.
It consists of a landscaped park with 184 benches facing the Pentagon. When the Pentagon was repaired in 2001–2002, a private chapel and indoor memorial were included at the spot where Flight 77 crashed into the building. In Shanksville, a concrete-and-glass visitor center was opened in 2015, situated on a hill overlooking the crash site and the white marble Wall of Names. An observation platform at the visitor center and the white marble wall are both aligned beneath the path of Flight 93. New York City firefighters donated a cross made of steel from the World Trade Center, mounted on top of a platform shaped like the Pentagon; it was installed outside the Shanksville firehouse on August 25, 2008. Many other permanent memorials have been established elsewhere. Scholarships and charities have been established by the victims' families and by many other organizations and private figures. On every anniversary in New York City, the names of the victims who died there are read out over music. The President of the United States attends a memorial service at the Pentagon and asks Americans to observe Patriot Day with a moment of silence. Smaller services are held in Shanksville, Pennsylvania, which are usually attended by the First Lady. In 2023, Joe Biden did not attend services at any of the affected sites, instead marking the day in Anchorage, Alaska, the first U.S. president to do so since the attacks.
========================================
[SOURCE: https://github.com/features]
The tools you need to build what you want Experience AI with Copilot Chat The latest GitHub previews Collaborative Coding Innovate faster with seamless collaboration. Spin up fully configured dev environments in the cloud with the full power of your favorite editor. Get suggestions for whole lines of code or entire functions right inside your editor. Receive notifications of contributor changes to a repository, with specified access limits, and seamlessly merge accepted updates. Dedicated space for your community to come together, ask and answer questions, and have open-ended conversations. Rapidly search, navigate, and understand code right from GitHub.com with our powerful new tools. Review new code, visualize changes, and merge confidently with automated status checks. Collaborate and discuss changes without a formal review or the risk of unwanted merges. Enforce branch merge restrictions by requiring reviews or limiting access to specific contributors. Automation and CI/CD Automate everything: CI/CD, testing, planning, project management, issue labeling, approvals, onboarding, and more Automate your software workflows by writing tasks and combining them to build, test, and deploy faster from GitHub. Host your own software packages or use them as dependencies in other projects, with both private and public hosting available. Create calls to get all the data and events you need within GitHub, and automatically kick off and advance your software workflows. Leverage thousands of actions and applications from our community to help build, improve, and accelerate your workflows.
Dozens of events and a webhooks API help you integrate with and automate work for your repository, organization, or application. Move automation to the cloud with on-demand Linux, macOS, Windows, ARM, and GPU environments for your workflow runs, all hosted by GitHub. Gain more environments and fuller control with labels, groups, and policies to manage runs on your own machines, plus an open source runner application. Map workflows, track their progression in real time, understand complex workflows, and communicate status with the rest of the team. Standardize and scale best practices and processes with preconfigured workflow templates shared across your organization. Application security Application security where found means fixed. Powered by GitHub Copilot Autofix. Find vulnerabilities in your code with CodeQL, GitHub’s industry-leading semantic code analysis. Prevent new vulnerabilities from being introduced by scanning every pull request. Powered by GitHub Copilot, generate automatic fixes for 90% of alert types in JavaScript, TypeScript, Java, and Python. Quickly remediate with contextual vulnerability intelligence and advice. Solve your backlog of application security debt with security campaigns that target and generate autofixes for up to 1,000 alerts at a time, rapidly reducing the risk of vulnerabilities and zero-day attacks. Detect exposed secrets in your public and private repositories, and revoke them to secure access to your services. Additional AI capabilities to detect elusive secrets like passwords. View the packages your project relies on, the repositories that depend on them, and any vulnerabilities detected in their dependencies. Receive alerts when new vulnerabilities affect your repositories, with GitHub detecting and notifying you of vulnerable dependencies in both public and private repositories. Keep your code secure by automatically opening pull requests that update vulnerable or out-of-date dependencies. 
Assess the security impact of new dependencies in pull requests before merging. Privately report, discuss, fix, and publish information about security vulnerabilities found in open source repositories. Enable your public repository to privately receive vulnerability reports from the community and collaborate on solutions. Browse or search GitHub's database of known vulnerabilities, featuring curated CVEs and security advisories linked to the GitHub dependency graph. Client apps Access GitHub anywhere: On Desktop, Mobile, and Command Line. Take your projects, ideas, and code to go with fully native mobile and tablet experiences. Manage issues and pull requests from the terminal, where you're already working with Git and your code. Simplify your development workflow with a GUI to visualize, commit, and push changes—no command line needed. Project management Keep feature requests, bugs, and more organized. Create a customized view of your issues and pull requests to plan and track your work. Track bugs, enhancements, and other requests, prioritize work, and communicate with stakeholders as changes are proposed and merged. Track progress on groups of issues or pull requests in a repository, and map groups to overall project goals. Leverage insights to visualize your projects by creating and sharing charts built from your project's data. View vulnerabilities, licenses, and other important information for the open source projects your organization depends on. Use data about activity, trends, and contributions within your repositories, to make data-driven improvements to your development cycle. Host project documentation in a wiki within your repository, allowing contributors to easily edit it on the web or locally. Governance & administration Simplify access and permissions management across your projects and teams. Create groups of user accounts that own repositories and manage access on a team-by-team or individual user basis. 
Organize your members to mirror your company's structure, with cascading access to permissions and mentions. Enable team synchronization between your identity provider and your organization on GitHub, including Entra ID and Okta. Define users' access level to your code, data, and settings based on their role in your organization. Ensure members have only the permissions they need by creating custom roles with fine-grained permission settings. Verify your organization's identity on GitHub and display that verification through a profile badge. Take care of your security assessment and certification needs by accessing GitHub’s cloud compliance reports, such as our SOC reports and Cloud Security Alliance CAIQ self-assessments (CSA CAIQ). Quickly review the actions performed by members of your organization. Monitor access, permission changes, user changes, and other events. Enhance your organization's security with scalable source code protections, and use rule insights to easily review how and why code changes occurred in your repositories. Enable collaboration between your organization and GitHub environments with a single point of visibility and management via an enterprise account. Share features and workflows between your GitHub Enterprise Server instance and GitHub Enterprise Cloud. Securely control access to organization resources like repositories, issues, and pull requests with SAML, while allowing users to authenticate with their GitHub usernames. Centralize repository management. LDAP is one of the most common protocols used to integrate third-party software with large company user directories. Manage the lifecycle and authentication of users on GitHub Enterprise Cloud from your identity provider (IdP). Use the SSO and SCIM providers of your choice for Enterprise Managed Users, separate from one another, for a more flexible approach to user lifecycle management. Community Financially support the open source projects your code depends on. 
Sponsor a contributor, maintainer, or project with one-time or recurring contributions. Learn new skills by completing tasks and projects directly within GitHub, guided by our friendly bot. Write cross-platform desktop applications using JavaScript, HTML, and CSS with the Electron framework, based on Node.js and Chromium. GitHub Education is a commitment to bringing tech and open source collaboration to students and educators across the globe. Explore all the plans to find the solution that fits your needs. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Soviet_Union] | [TOKENS: 18248] |
Contents Soviet Union The Union of Soviet Socialist Republics[o][p] (USSR),[q] also known as the Soviet Union,[r] was a transcontinental country that spanned much of Eurasia from 1922 until its dissolution in 1991. It was the world's third-most populous country, largest by area, and bordered twelve countries.[s] A diverse multinational state, it was organized as a federal union of national republics, the largest and most populous being the Russian Soviet Federative Socialist Republic.[t] In practice, its government and economy were highly centralized. As a one-party state governed by its Communist Party, it was the flagship communist state. Its capital and largest city was Moscow. The Soviet Union's roots lay in the October Revolution of 1917. The new government, led by Vladimir Lenin, established the Russian SFSR, the world's first constitutionally communist state. Following the Bolshevik victory in the Russian Civil War, the Russian SFSR and its subordinate republics were merged into the Soviet Union in 1922. Following Lenin's death in 1924, Joseph Stalin came to power, inaugurating rapid industrialization and forced collectivization that led to significant growth but contributed to a 1930s famine killing millions. Soviet forced labour expanded via the Gulag system. Stalin's government conducted the late 1930s Great Purge via deportations, executions, and show trials. Failing to build an anti-Nazi coalition in Europe, the Soviet Union signed a 1939 non-aggression pact with Nazi Germany. Nonetheless, in 1941 Germany invaded the Soviet Union in the largest land invasion in history, opening the Eastern Front of World War II. The Red Army played a decisive role in the Allies defeating the Axis powers, while liberating much of Central and Eastern Europe. At around 27 million casualties, the country suffered the most deaths in World War II. 
In the war's aftermath, the Soviet Union consolidated territories it occupied into satellite states, and undertook rapid economic development, cementing its status as a superpower. Geopolitical tensions with the United States led to the Cold War. The US-led Western Bloc coalesced into the NATO military alliance in 1949, prompting the Eastern Bloc to form the Warsaw Pact in 1955. With scant direct combat, the blocs fought via ideological and proxy wars. In 1953, following Stalin's death, Nikita Khrushchev led a campaign of de-Stalinization. Resulting ideological tensions with communist China, led by Mao Zedong, culminated in an acrimonious split. In the following fifteen years the Soviet military suppressed uprisings in East Germany, Hungary and Czechoslovakia, while resolution of the Cuban Missile Crisis narrowly averted a global conflict. Under the 18-year rule of Leonid Brezhnev, prosperity turned toward stagnation and corruption, while US relations eased. In 1985, Mikhail Gorbachev sought reform through his policies glasnost and perestroika. In 1989, most Warsaw Pact countries overthrew their Soviet-backed regimes, ending the Eastern Bloc. Nationalist movements across the Soviet republics declared sovereignty. In 1991, after a successful referendum to establish a renewed federation, a failed coup by hardliners prompted Russia, Ukraine, and Belarus to secede. On 26 December, Gorbachev officially recognized the dissolution of the Soviet Union. Boris Yeltsin, leader of the Russian SFSR, oversaw its reconstitution into the Russian Federation, the Soviet Union's successor state; the fourteen other republics emerged as fully independent states. All except the Baltics joined the Commonwealth of Independent States. The post-Soviet states experienced a humanitarian disaster, and dozens of wars and conflicts. 
The Soviet Union was one of the world's two superpowers, with the largest standing military, the second-largest economy, a hegemony in Eastern Europe and Asia, global diplomacy, ideological influence (particularly in the Global South), and scientific and technological accomplishments. It wielded the world's largest arsenals of nuclear, chemical, and biological weapons. Its space program made extensive achievements in the Space Race including the first artificial satellite, and first human spaceflight. Soviet culture was influenced by the official socialist realism style and later underground samizdat publications. As a major Allied nation, it became one of the five permanent members of the United Nations Security Council. In some post-Soviet states including Russia, nostalgia remains high for the Soviet Union, while others view it negatively. Academics have variously criticized the Soviet system as authoritarian, bureaucratic, and state capitalist, while some have praised its industrialization, scientific capacity, and anti-imperialist influence globally. Etymology The word "soviet" is derived from the Russian word sovet (Russian: совет), meaning 'council', 'assembly', 'advice',[u] ultimately deriving from the proto-Slavic verbal stem of *vět-iti ('to inform'), related to Slavic věst ('news') and English wise. The word sovietnik means 'councillor'. Some organizations in Russian history were called council (Russian: совет). In the Russian Empire, the State Council, which functioned from 1810 to 1917, was referred to as a Council of Ministers. The soviets as workers' councils first appeared during the 1905 Russian Revolution. Although they were quickly suppressed by the Imperial army, after the February Revolution of 1917, workers' and soldiers' soviets emerged throughout the country and shared power with the Russian Provisional Government. 
The Bolsheviks, led by Vladimir Lenin, demanded that all power be transferred to the soviets, and gained support from the workers and soldiers. After the October Revolution, in which they seized power from the Provisional Government in the name of the soviets, Lenin proclaimed the formation of the Russian Socialist Federal Soviet Republic (RSFSR). During the Georgian Affair of 1922, Lenin called for the Russian SFSR and other national soviet republics to form a greater union, which he initially named the Union of Soviet Republics of Europe and Asia (Russian: Союз Советских Республик Европы и Азии, romanized: Soyuz Sovyetskikh Respublik Evropy i Azii). Joseph Stalin initially resisted Lenin's proposal but ultimately accepted it, and with Lenin's agreement he changed the name to the Union of Soviet Socialist Republics (USSR), although all the republics began as "socialist soviet" republics and did not switch the word order to "soviet socialist" until 1936. In the regional languages of several republics, the local word for council was changed to an adaptation of the Russian soviet only quite late, and in others, such as the Ukrainian SSR, never. СССР (in the Latin alphabet: SSSR) is the abbreviation of the Russian-language cognate of USSR, as written in Cyrillic letters. The Soviets used this abbreviation so frequently that audiences worldwide became familiar with its meaning. Beyond this, the most common Russian initialism is Союз ССР (transliteration: Soyuz SSR), which essentially translates to Union of SSRs in English. In addition, the Russian short-form name Советский Союз (transliteration: Sovyetsky Soyuz, which literally means Soviet Union) is also commonly used, but only in its unabbreviated form.
Since the start of the Great Patriotic War at the latest, abbreviating the Russian name of the Soviet Union as СС has been taboo, the reason being that СС as a Russian Cyrillic abbreviation is associated with the infamous Schutzstaffel of Nazi Germany, as SS is in English. In English-language media, the state was referred to as the Soviet Union or the USSR. The Russian SFSR dominated the Soviet Union to such an extent that, for most of the Soviet Union's existence, it was colloquially, but incorrectly, referred to as Russia. History The history of the Soviet Union began with the ideals of the Bolshevik Revolution and ended in dissolution amid economic collapse and political disintegration. Established in 1922 following the Russian Civil War, the Soviet Union became a one-party state under the Communist Party. Its early years under Lenin were marked by the implementation of socialist policies and the New Economic Policy (NEP), which allowed for market-oriented reforms. The rise of Joseph Stalin in the late 1920s ushered in an era of intense centralization and totalitarianism. Stalin's rule was characterized by the forced collectivization of agriculture, rapid industrialization, and the Great Purge, which eliminated perceived enemies of the state. The Soviet Union, one of the Big Four Allied powers alongside the United States, the United Kingdom, and China, played a crucial role in the Allied victory in World War II. It paid a tremendous human cost with millions of Soviet citizens dying in the conflict. The Soviet Union emerged as one of the world's two superpowers, leading the Eastern Bloc in opposition to the Western Bloc during the Cold War. This period saw the USSR engage in an arms race, the Space Race, and proxy wars around the globe. The post-Stalin leadership, particularly under Nikita Khrushchev, initiated a de-Stalinization process, leading to a period of liberalization and relative openness known as the Khrushchev Thaw. 
However, the subsequent era under Leonid Brezhnev, sometimes referred to as the Era of Stagnation, was marked by economic decline, political corruption, and a rigid gerontocracy. Despite efforts to maintain the Soviet Union's superpower status, the economy struggled due to its centralized nature, technological backwardness, and inefficiencies. The vast military expenditures and burdens of maintaining the Eastern Bloc further strained the Soviet economy. In the 1980s, Mikhail Gorbachev's policies of glasnost (openness) and perestroika (restructuring) aimed to revitalize the Soviet system but instead accelerated its unraveling. Nationalist movements gained momentum across the Soviet republics and the control of the Communist Party weakened. The failed coup attempt in August 1991 against Gorbachev by hardline communists hastened the end of the Soviet Union, which formally dissolved on 26 December 1991, ending nearly seven decades of Soviet rule. Geography With an area of 22,402,200 square kilometres (8,649,500 sq mi), the Soviet Union was the world's largest country, a status that is retained by the Russian Federation. Covering a sixth of Earth's land surface, its size was comparable to that of North America. Two other successor states are also very large: Kazakhstan ranks among the top 10 countries by land area, and Ukraine is the largest country entirely in Europe. The European portion accounted for a quarter of the country's area and was the cultural and economic center. The eastern part in Asia extended to the Pacific Ocean to the east and Afghanistan to the south, and, except for some areas in Central Asia, was much less populous. It spanned over 10,000 kilometres (6,200 mi) east to west across 11 time zones, and over 7,200 kilometres (4,500 mi) north to south. It had five climate zones: tundra, taiga, steppes, desert and mountains. The USSR, like Russia, had the world's longest border, measuring over 60,000 kilometres (37,000 mi), or 1½ circumferences of Earth.
Two-thirds of it was coastline. The country bordered Afghanistan, the People's Republic of China, Czechoslovakia, Finland, Hungary, Iran, Mongolia, North Korea, Norway, Poland, Romania, and Turkey from 1945 to 1991. The Bering Strait separated the USSR from the United States, while the La Pérouse Strait separated it from Japan. The country's highest mountain was Communism Peak (now Ismoil Somoni Peak) in Tajikistan, at 7,495 metres (24,590 ft). The USSR also included most of the world's largest lakes: the Caspian Sea (shared with Iran), and Lake Baikal, the world's largest (by volume) and deepest freshwater lake, which is also an internal body of water in Russia. Government and politics The Soviet communist state system was based on unified state power and democratic centralism. The highest organ of state authority, the Supreme Soviet of the Soviet Union, stood above all other state organs and worked under the leadership of the Communist Party of the Soviet Union. The executive organ of the state (synonymous with government), the Council of Ministers, was an internal organ of the All-Union Supreme Soviet. At the top of the Communist Party was the Central Committee, elected at Party Congresses and Conferences. In turn, the Central Committee voted for a Politburo (called the Presidium between 1952 and 1966), Secretariat and the general secretary (First Secretary from 1953 to 1966), the de facto highest office in the Soviet Union. Depending on the degree of power consolidation, it was either the Politburo as a collective body or the General Secretary, who always was one of the Politburo members, that effectively led the party and the country (except for the period of the highly personalized authority of Stalin, exercised directly through his position in the Council of Ministers rather than the Politburo after 1941).
They were not controlled by the general party membership, as the key principle of the party organization was democratic centralism, demanding strict subordination to higher bodies, and elections went uncontested, endorsing the candidates proposed from above. The Communist Party maintained its dominance over the state mainly through its control over the system of appointments. All senior government officials and most deputies of the Supreme Soviet were members of the CPSU. Of the party heads themselves, Stalin (1941–1953) and Khrushchev (1958–1964) were Premiers. Upon the forced retirement of Khrushchev, the party leader was prohibited from this kind of double membership, but the later General Secretaries for at least some part of their tenure occupied the mostly ceremonial position of Chairman of the Presidium of the Supreme Soviet, the nominal head of state. The institutions at lower levels were overseen and at times supplanted by primary party organizations. However, in practice the degree of control the party was able to exercise over the state bureaucracy, particularly after the death of Stalin, was far from total, with the bureaucracy pursuing different interests that were at times in conflict with the party, nor was the party itself monolithic from top to bottom, although factions were officially banned. The Supreme Soviet (successor of the Congress of Soviets) was nominally the highest organ of state authority for most of the Soviet history, at first acting as a rubber stamp institution, approving and implementing all decisions made by the party. However, its powers and functions were extended in the late 1950s, 1960s, and 1970s, including the creation of new state commissions and committees. It gained additional powers relating to the approval of the Five-Year Plans and the government budget. 
The Supreme Soviet elected a Presidium (successor of the Central Executive Committee) to wield its power between plenary sessions, ordinarily held twice a year, and appointed the Supreme Court, the Procurator General and the Council of Ministers (known before 1946 as the Council of People's Commissars), headed by the Chairman (Premier) and managing an enormous bureaucracy responsible for the administration of the economy and society. State and party structures of the constituent republics largely emulated the structure of the central institutions, although the Russian SFSR, unlike the other constituent republics, for most of its history had no republican branch of the CPSU, being ruled directly by the union-wide party until 1990. Local authorities were organized likewise into party committees, local Soviets and executive committees. While the state system was nominally federal, the party was unitary. The state security police (the KGB and its predecessor agencies) played an important role in Soviet politics. It was instrumental in the Red Terror and Great Purge, but was brought under strict party control after Stalin's death. Under Yuri Andropov, the KGB engaged in the suppression of political dissent and maintained an extensive network of informers, reasserting itself as a political actor to some extent independent of the party-state structure, culminating in the anti-corruption campaign targeting high-ranking party officials in the late 1970s and early 1980s. The constitution, which was promulgated in 1924, 1936 and 1977, did not limit state power. No separation of powers existed in the Soviet Union, as the state system was based on the unified state power of the highest organ of state authority, that is, the All-Union Supreme Soviet which worked under the party's leadership. The system was governed less by statute than by informal conventions, and no settled mechanism of leadership succession existed. 
Bitter and at times deadly power struggles took place in the Politburo after the deaths of Lenin and Stalin, as well as after Khrushchev's dismissal, itself due to a decision by both the Politburo and the Central Committee. All leaders of the Communist Party before Gorbachev died in office, except Georgy Malenkov and Khrushchev, who were both dismissed from the party leadership amid internal struggle within the party. Between 1988 and 1990, facing considerable opposition, Mikhail Gorbachev enacted reforms shifting power away from the highest bodies of the party and making the Supreme Soviet less dependent on them. The Congress of People's Deputies was established, the majority of whose members were directly elected in competitive elections held in March 1989, the first in Soviet history. The Congress now elected the Supreme Soviet, which became a full-time parliament, and much stronger than before. For the first time since the 1920s, it refused to rubber stamp proposals from the party and Council of Ministers. In 1990, Gorbachev introduced and assumed the position of the President of the Soviet Union, concentrated power in his executive office, independent of the party, and subordinated the government, now renamed the Cabinet of Ministers of the USSR, to himself. Tensions grew between the Union-wide authorities under Gorbachev, reformists led in Russia by Boris Yeltsin and controlling the newly elected Supreme Soviet of the Russian SFSR, and communist hardliners. On 19–21 August 1991, a group of hardliners staged a coup attempt. The coup failed, and the State Council of the Soviet Union became the highest organ of state power 'in the period of transition'. Gorbachev resigned as General Secretary, only remaining President for the final months of the existence of the USSR. 
The judiciary was not independent of the supreme state organ of power, and the Supreme Court as the supreme judicial organ supervised the lower courts (People's Court) and applied the law as established by the constitution or as interpreted by the Supreme Soviet. The Constitutional Oversight Committee reviewed the constitutionality of laws and acts. The Soviet Union used the inquisitorial system of Roman law, where the judge, procurator, and defence attorney collaborate to "establish the truth". Human rights in the Soviet Union were severely limited. The Soviet Union was a totalitarian state from 1927 until 1953 and a one-party state until 1990. Freedom of speech was suppressed and dissent was punished. Independent political activities were not tolerated, whether these involved participation in free labour unions, private corporations, independent churches or opposition political parties. The freedom of movement within and especially outside the country was limited. The state restricted citizens' rights to private property. According to the Universal Declaration of Human Rights, human rights are the "basic rights and freedoms to which all humans are entitled", including the right to life and liberty, freedom of expression, and equality before the law; and social, cultural and economic rights, including the right to participate in culture, the right to food, the right to work, and the right to education. The Soviet conception of human rights was very different from that of international law. According to Soviet legal theory, "it is the government who is the beneficiary of human rights which are to be asserted against the individual". The Soviet state was considered the source of human rights. Therefore, the Soviet legal system considered law an arm of politics, and it also considered courts agencies of the government. Extensive extrajudicial powers were given to the Soviet secret police agencies.
In practice, the Soviet government significantly curbed the rule of law, civil liberties, protection of law and guarantees of property, which were considered examples of "bourgeois morality" by Soviet law theorists such as Andrey Vyshinsky. The USSR and other countries in the Soviet Bloc had abstained from affirming the Universal Declaration of Human Rights (1948), saying that it was "overly juridical" and potentially infringed on national sovereignty. The Soviet Union later signed legally binding human rights documents, such as the International Covenant on Civil and Political Rights in 1973 (and the 1966 International Covenant on Economic, Social and Cultural Rights), but they were neither widely known nor accessible to people living under Communist rule, nor were they taken seriously by the Communist authorities. Under Joseph Stalin, the death penalty was extended to adolescents as young as 12 years old in 1935. Sergei Kovalev recalled "the famous article 125 of the Constitution which enumerated all basic civil and political rights" in the Soviet Union. But when he and other prisoners attempted to use this as a legal basis for their abuse complaints, their prosecutor's argument was that "the Constitution was written not for you, but for American Negroes, so that they know how happy the lives of Soviet citizens are". Crime was defined not as the infraction of law but as any action which could threaten the Soviet state and society. For example, a desire to make a profit could be interpreted as a counter-revolutionary activity punishable by death. The liquidation and deportation of millions of peasants in 1928–31 was carried out within the terms of the Soviet Civil Code. Some Soviet legal scholars even said that "criminal repression" may be applied in the absence of guilt.
Martin Latsis, chief of Soviet Ukraine's secret police explained: "Do not look in the file of incriminating evidence to see whether or not the accused rose up against the Soviets with arms or words. Ask him instead to which class he belongs, what is his background, his education, his profession. These are the questions that will determine the fate of the accused. That is the meaning and essence of the Red Terror." The purpose of public trials was "not to demonstrate the existence or absence of a crime – that was predetermined by the appropriate party authorities – but to provide yet another forum for political agitation and propaganda for the instruction of the citizenry (see Moscow Trials for example). Defense lawyers, who had to be party members, were required to take their client's guilt for granted..." During his rule, Stalin always made the final policy decisions. Otherwise, Soviet foreign policy was set by the commission on the Foreign Policy of the Central Committee of the Communist Party of the Soviet Union, or by the party's highest body the Politburo. Operations were handled by the separate Ministry of Foreign Affairs. It was known as the People's Commissariat for Foreign Affairs (or Narkomindel), until 1946. The most influential spokesmen were Georgy Chicherin, Maxim Litvinov, Vyacheslav Molotov, Andrey Vyshinsky, and Andrei Gromyko. Intellectuals were based in the Moscow State Institute of International Relations. The Marxist-Leninist leadership of the Soviet Union intensely debated foreign policy issues and changed directions several times. Even after Stalin assumed dictatorial control in the late 1920s, there were debates, and he frequently changed positions. During the country's early period, it was assumed that Communist revolutions would break out soon in every major industrial country, and it was the Russian responsibility to assist them. The Comintern was the weapon of choice. 
A few revolutions did break out, but they were quickly suppressed; the longest-lasting one, the Hungarian Soviet Republic, lasted only from 21 March 1919 to 1 August 1919. The Russian Bolsheviks were in no position to give any help. By 1921, Lenin, Trotsky, and Stalin realized that capitalism had stabilized itself in Europe and there would not be any widespread revolutions anytime soon. It became the duty of the Russian Bolsheviks to protect what they had in Russia, and avoid military confrontations that might destroy their bridgehead. Russia was now a pariah state, along with Germany. The two came to terms in 1922 with the Treaty of Rapallo that settled long-standing grievances. At the same time, the two countries secretly set up training programs for the illegal German army and air force operations at hidden camps in the USSR. Moscow eventually stopped threatening other states, and instead worked to open peaceful relationships in terms of trade and diplomatic recognition. The United Kingdom dismissed the warnings of Winston Churchill and a few others about a continuing Marxist-Leninist threat, and opened trade relations and de facto diplomatic recognition in 1922. There was hope for a settlement of the pre-war Tsarist debts, but it was repeatedly postponed. Formal recognition came when the new Labour Party came to power in 1924. All the other countries followed suit in opening trade relations. Henry Ford opened large-scale business relations with the Soviets in the late 1920s, hoping that it would lead to long-term peace. Finally, in 1933, the United States officially recognized the USSR, a decision backed by public opinion and especially by US business interests that expected an opening of a new profitable market. In the late 1920s and early 1930s, Stalin ordered Marxist-Leninist parties across the world to strongly oppose non-Marxist political parties, labour unions or other organizations on the left, which they labelled social fascists.
In the usage of the Soviet Union, and of the Comintern and its affiliated parties in this period, the epithet fascist was used to describe capitalist society in general and virtually any anti-Soviet or anti-Stalinist activity or opinion. Stalin reversed himself in 1934 with the Popular Front program that called on all Marxist parties to join with all anti-fascist political, labour, and organizational forces that were opposed to fascism, especially of the Nazi variety. The rapid growth of power in Nazi Germany encouraged both Paris and Moscow to form a military alliance, and the Franco-Soviet Treaty of Mutual Assistance was signed in May 1935. A firm believer in collective security, Stalin's foreign minister Maxim Litvinov worked very hard to form a closer relationship with France and Britain. In 1939, half a year after the Munich Agreement, the USSR attempted to form an anti-Nazi alliance with France and Britain. Adolf Hitler proposed a better deal, which would give the USSR control over much of Eastern Europe through the Molotov–Ribbentrop Pact. In September, Germany invaded Poland, and the USSR also invaded later that month, resulting in the partition of Poland. In response, Britain and France declared war on Germany, marking the beginning of World War II. Joseph Stalin controlled all foreign relations of the Soviet Union during the interwar period and up until his death in 1953. Despite the increasing build-up of Germany's war machine and the outbreak of the Second Sino-Japanese War, the Soviet Union did not cooperate with any other nation, choosing to follow its own path. However, after Operation Barbarossa, the Soviet Union's priorities changed. Despite previous conflict with the United Kingdom, Vyacheslav Molotov dropped his post-war border demands. The Cold War was a period of geopolitical tension between the United States and the Soviet Union and their respective allies, the Western Bloc and the Eastern Bloc, which began following World War II in 1945.
The term cold war is used because there was no large-scale fighting directly between the two superpowers, but they each supported major regional conflicts known as proxy wars. The conflict was based around the ideological and geopolitical struggle for global influence by these two superpowers, following their temporary alliance and victory against Nazi Germany in 1945. Aside from the nuclear arsenal development and conventional military deployment, the struggle for dominance was expressed via indirect means such as psychological warfare, propaganda campaigns, espionage, far-reaching embargoes, rivalry at sports events and technological competitions such as the Space Race. Constitutionally, the USSR was a federation of constituent Union Republics, which were either unitary states, such as Ukraine or Byelorussia (SSRs), or federations, such as Russia or Transcaucasia (SFSRs), all four being the founding republics that signed the Treaty on the Creation of the USSR in December 1922. In 1924, during the national delimitation in Central Asia, Uzbekistan and Turkmenistan were formed from parts of Russia's Turkestan ASSR and two Soviet dependencies, the Khorezm and Bukharan PSPs. In 1929, Tajikistan was split off from the Uzbekistan SSR. With the constitution of 1936, the Transcaucasian SFSR was dissolved, resulting in its constituent republics of Armenia, Georgia and Azerbaijan being elevated to Union Republics, while Kazakhstan and Kirghizia were split off from the Russian SFSR, gaining the same status. In August 1940, the Moldavian SSR was formed from parts of the Ukrainian SSR and Soviet-occupied Bessarabia. Estonia, Latvia and Lithuania were also annexed by the Soviet Union and turned into SSRs, which was not recognized by most of the international community and was considered an illegal occupation.
After the Soviet invasion of Finland, the Karelo-Finnish SSR was formed on annexed territory as a Union Republic in March 1940 and then incorporated into Russia as the Karelian ASSR in 1956. Between July 1956 and September 1991, there were 15 union republics. Military Under the Military Law of September 1925, the Soviet Armed Forces consisted of the Land Forces, the Red Army Air Force, the Navy, the Joint State Political Directorate (OGPU) and the Internal Troops. The OGPU later became independent and in 1934 joined the NKVD secret police, and so its internal troops were under the joint leadership of the defense and internal commissariats. After World War II, the Strategic Missile Forces (1959), Air Defense Forces (1948) and National Civil Defense Forces (1970) were formed, which ranked first, third, and sixth in the official Soviet system of importance (ground forces were second, the Air Force fourth, and the Navy fifth). The army had the greatest political influence. In 1989, it numbered two million soldiers, divided among 150 motorized and 52 tank divisions. Until the early 1960s, the Soviet navy was a rather small military branch, but after the Cuban Missile Crisis, under the leadership of Sergei Gorshkov, it expanded significantly. It became known for its submarine fleet. In 1989, it numbered 500,000 men. The Soviet Air Force focused on a fleet of strategic bombers, which in wartime were to eradicate enemy infrastructure and nuclear capacity. The air force also had a number of fighters and tactical bombers to support the army. The Strategic Missile Forces had more than 1,400 intercontinental ballistic missiles (ICBMs), deployed among 28 bases and 300 command centers. After 1945, the Soviet Ground Forces suppressed several uprisings in Eastern Europe and were involved in many other operations abroad. These included the suppression of the uprising in East Germany (1953), the Hungarian revolution (1956) and the invasion of Czechoslovakia (1968).
The Soviet Union also fought the war in Afghanistan from 1979 to 1989. In the Soviet Union, general conscription applied, meaning all able-bodied males aged 18 and older were drafted into the armed forces. Economy The Soviet Union adopted a command economy, whereby production and distribution of goods were centralized and directed by the government. For the overwhelming majority of its existence, the USSR did not use GDP or GNP to measure its economy, instead relying on the Material Product System. The first Bolshevik experience with a command economy was the policy of war communism, which involved the nationalization of industry, centralized distribution of output, forced requisition of agricultural production, and attempts to eliminate money circulation, private enterprises and free trade. Barrier troops were also used to enforce Bolshevik control over food supplies in areas controlled by the Red Army, a role which soon earned them the hatred of the Russian civilian population. After the severe economic collapse, Lenin replaced war communism with the New Economic Policy (NEP) in 1921, legalizing free trade and private ownership of small businesses. The economy steadily recovered as a result. After a long debate among the members of the Politburo about the course of economic development, by 1928–1929, upon gaining control of the country, Stalin abandoned the NEP and pushed for full central planning, starting forced collectivization of agriculture and enacting draconian labour legislation. Resources were mobilized for rapid industrialization, which significantly expanded Soviet capacity in heavy industry and capital goods during the 1930s. The primary motivation for industrialization was preparation for war, mostly due to distrust of the outside capitalist world. As a result, the USSR was transformed from a largely agrarian economy into a great industrial power, leading the way for its emergence as a superpower after World War II.
The war caused extensive devastation of the Soviet economy and infrastructure, which required massive reconstruction. By the early 1940s, the Soviet economy had become relatively self-sufficient; for most of the period until the creation of Comecon, only a tiny share of domestic products was traded internationally. After the creation of the Eastern Bloc, external trade rose rapidly. However, the influence of the world economy on the USSR was limited by fixed domestic prices and a state monopoly on foreign trade. Grain and sophisticated consumer manufactures became major import articles from around the 1960s. During the arms race of the Cold War, the Soviet economy was burdened by military expenditures, heavily lobbied for by a powerful bureaucracy dependent on the arms industry. At the same time, the USSR became the largest arms exporter to the Third World. A portion of Soviet resources during the Cold War was allocated as aid to the Soviet-aligned states. The Soviet Union's military budget in the 1970s was gigantic, forming 40–60% of the entire federal budget and accounting for 15% of the USSR's GDP (13% in the 1980s). From the 1930s until its dissolution in late 1991, the way the Soviet economy operated remained essentially unchanged. The economy was formally directed by central planning, carried out by Gosplan and organized in five-year plans. However, in practice, the plans were highly aggregated and provisional, subject to ad hoc intervention by superiors. All critical economic decisions were taken by the political leadership. Allocated resources and plan targets were usually denominated in rubles rather than in physical goods. Credit was discouraged, but widespread. The final allocation of output was achieved through relatively decentralized, unplanned contracting. Although in theory prices were legally set from above, in practice they were often negotiated, and informal horizontal links (e.g. between producer factories) were widespread.
A number of basic services were state-funded, such as education and health care. In the manufacturing sector, heavy industry and defence were prioritized over consumer goods. Consumer goods, particularly outside large cities, were often scarce, of poor quality and limited variety. Under the command economy, consumers had almost no influence on production, and the changing demands of a population with growing incomes could not be satisfied by supplies at rigidly fixed prices. A massive unplanned second economy grew up at low levels alongside the planned one, providing some of the goods and services that the planners could not. The legalization of some elements of the decentralized economy was attempted with the reform of 1965. Although statistics of the Soviet economy are notoriously unreliable and its economic growth difficult to estimate precisely, by most accounts, the economy continued to expand until the mid-1980s. During the 1950s and 1960s, it had comparatively high growth and was catching up to the West. However, after 1970, growth, while still positive, slowed far more sharply and persistently than in other countries, despite a rapid increase in the capital stock (the rate of capital increase was surpassed only by Japan). Professor of Economic History Bob Allen contends that in the era in which the Soviet economy was publicly owned and planned (1928–1989), the Soviet Union's GDP per capita growth outpaced nearly all other world economies, trailing only Japan, South Korea, and Taiwan. Data shows that Soviet per capita income expanded by a factor of 5.2, exceeding the growth of Western Europe at 4.0, and of the USA, Canada, Australia, and New Zealand at 3.3. Ultimately, the Soviet model of public ownership proved more effective at raising average incomes during this period than the world's primary industrialized capitalist systems.
According to Stephen Gowans, this growth trajectory was eventually undermined by the "accumulated toll on the Soviet economy of the West's efforts to bring it down, the Reagan administration's intensification of the Cold War, and the Soviet leadership's inability to find a way out of the predicament these developments occasioned." Scholar Christopher Davidson argued that, during the 1980s, the Reagan administration attempted to weaponize the global energy market against the USSR. At the request of CIA Director William J. Casey, Saudi Arabia flooded the market with oil, dropping the price from $28 to $10 per barrel and draining Soviet foreign currency reserves. This was later described by a former CIA chief of staff as a "body blow to the Soviets. It was the equivalent of stepping on their oxygen tube." Overall, the growth rate of per capita income in the Soviet Union between 1960 and 1989 was slightly above the world average (based on 102 countries). A 1986 study published in the American Journal of Public Health, citing World Bank data, claimed that the Soviet model provided a better quality of life and human development than market economies at the same level of economic development in most cases. According to Stanley Fischer and William Easterly, growth could have been faster. By their calculation, per capita income in 1989 should have been twice as high as it was, given the country's levels of investment, education and population. The authors attribute this poor performance to the low productivity of capital. Steven Rosefielde states that the standard of living declined due to Stalin's despotism. While there was a brief improvement after his death, it lapsed into stagnation. In 1987, Mikhail Gorbachev attempted to reform and revitalize the economy with his program of perestroika. His policies relaxed state control over enterprises but did not replace it with market incentives, resulting in a sharp decline in output.
The economy, already suffering from reduced petroleum export revenues, started to collapse. Prices were still fixed, and property was still largely state-owned until after the country's dissolution. For most of the period after World War II until its collapse, Soviet GDP (PPP) was the second-largest in the world, and third during the second half of the 1980s, although on a per-capita basis, it was behind that of First World countries. Compared to countries with similar per-capita GDP in 1928, the Soviet Union experienced significant growth. In 1990, the country had a Human Development Index of 0.920, placing it in the 'high' category of human development. It was the third-highest in the Eastern Bloc, behind Czechoslovakia and East Germany, and 25th out of 130 countries worldwide. The need for fuel declined in the Soviet Union from the 1970s to the 1980s, both per ruble of gross social product and per ruble of industrial product. The decline was very rapid between 1965 and 1970, then slowed between 1970 and 1975. From 1975 to 1980, the decline continued at an even slower rate, with fuel requirements per ruble of gross social product decreasing by only 2.6%. David Wilson, a historian, believed that the gas industry would account for 40% of Soviet fuel production by the end of the century. His theory did not come to fruition because of the USSR's collapse. According to Wilson, the Soviet Union was, in theory, well-positioned to avoid an energy crisis and could have sustained economic growth rates of 2–2.5% during the 1990s, supported by its energy resources. However, the energy sector faced many difficulties, among them the country's high military expenditure and hostile relations with the First World. In 1991, the Soviet Union had a pipeline network of 82,000 kilometres (51,000 mi) for crude oil and another 206,500 kilometres (128,300 mi) for natural gas.
Petroleum and petroleum-based products, natural gas, metals, wood, agricultural products, and a variety of manufactured goods, primarily machinery, arms and military equipment, were exported. In the 1970s and 1980s, the USSR heavily relied on fossil fuel exports to earn hard currency. At its peak in 1988, it was the largest producer and second-largest exporter of crude oil, surpassed only by Saudi Arabia. The Soviet Union placed great emphasis on science and technology. Lenin believed the USSR would never overtake the developed world if it remained as technologically backward as it was upon its founding. Soviet authorities proved their commitment to Lenin's belief by developing massive networks of research and development organizations. In the early 1960s, 40% of chemistry PhDs in the Soviet Union were earned by women, compared with only 5% in the United States. By 1989, Soviet scientists were among the world's best-trained specialists in several areas, such as energy physics, selected areas of medicine, mathematics, welding, space technology, and military technologies. However, due to rigid state planning and bureaucracy, the Soviets remained far behind the First World in chemistry, biology, and computer science. Under Stalin, the Soviet government persecuted geneticists in favour of Lysenkoism, a pseudoscience rejected by the scientific community in the Soviet Union and abroad but supported by Stalin's inner circle. Implemented in the USSR and China, it resulted in reduced crop yields and is widely believed to have contributed to the Great Chinese Famine. In the 1980s, the Soviet Union had more scientists and engineers relative to the world's population than any other major country, owing to strong levels of state support. Some of its most remarkable technological achievements, such as launching the world's first space satellite, were achieved through military research.
Under the Reagan administration, Project Socrates determined that the Soviet Union addressed the acquisition of science and technology in a manner radically different from that of the United States. The US prioritized indigenous research and development in both the public and private sectors. In contrast, the USSR placed greater emphasis on acquiring foreign technology, which it did through both covert and overt means. However, centralized state planning made Soviet technological development highly inflexible. This was exploited by the US to undermine the strength of the Soviet Union and thus foster its reform. In the late 1950s, the USSR launched the first artificial satellite, Sputnik 1, which marked the beginning of the Space Race, a competition with the United States to achieve superior spaceflight capability. This was followed by other successful satellites, most notably Sputnik 5, which carried test dogs into space. On 12 April 1961, the USSR launched Vostok 1, which carried Yuri Gagarin, making him the first human to be launched into space and complete a space journey. The first plans for space shuttles and orbital stations were drawn up in Soviet design offices, but personal disputes between designers and management prevented their development. The Luna program consisted only of automated spacecraft launches, with no crewed spacecraft. The N1, a super heavy-lift launch vehicle intended to match the American Saturn V for a Soviet crewed Moon landing, failed all four of its test launches, and the lunar leg of the Space Race was won by the Americans. The Soviet public's reaction to the American Moon landing was mixed, shaped in part by the government's limited release of information about it: a portion of the populace paid it little attention, while another portion was angered.
In the 1970s, specific proposals for the design of a space shuttle emerged, but shortcomings, especially in the electronics industry (rapid overheating of electronics), postponed the project until the end of the 1980s. The first shuttle, the Buran, flew in 1988, but without a human crew. Another, Ptichka, endured prolonged construction and was canceled in 1991. Energia, the super heavy-lift rocket developed to launch the shuttles and the most powerful rocket in the world at the time, remains unused today. In the late 1980s, the Soviet Union built the Mir orbital station. It built on the design of the Salyut stations, and its role was purely civilian research. Mir was the only orbital station in operation from 1986 to 1998. Gradually, other modules were added to it, including American modules. However, the station deteriorated rapidly after a fire on board and was deorbited in 2001, burning up in the Earth's atmosphere. Transport was a vital component of the country's economy. The economic centralization of the late 1920s and 1930s led to the development of infrastructure on a massive scale, most notably the establishment of Aeroflot, an aviation enterprise. The country had a wide variety of modes of transport by land, water and air. However, due to inadequate maintenance, much of the road, water and civil aviation transport was outdated and technologically backward compared to the First World. Soviet rail transport was the largest and most intensively used in the world; it was also better developed than most of its Western counterparts. By the late 1970s and early 1980s, Soviet economists were calling for the construction of more roads to ease the burden on the railways and to improve the Soviet government budget. The street network and automotive industry remained underdeveloped, and dirt roads were common outside major cities. Soviet maintenance projects proved unable to take care of even the few roads the country had.
By the early-to-mid-1980s, the Soviet authorities tried to solve the road problem by ordering the construction of new ones. Meanwhile, the automobile industry was growing at a faster rate than road construction. The underdeveloped road network led to a growing demand for public transport. Despite improvements, several aspects of the transport sector were still riddled with problems due to outdated infrastructure, lack of investment, corruption and bad decision-making. Soviet authorities were unable to meet the growing demand for transport infrastructure and services. The Soviet merchant navy was one of the largest in the world. Demographics Excess deaths throughout World War I and the Russian Civil War (including the famine of 1921–1922 that was triggered by Lenin's war communism policies) amounted to a combined total of 18 million, some 10 million in the 1930s, and more than 20 million in 1941–1945. The postwar Soviet population was 45 to 50 million smaller than it would have been if pre-war demographic growth had continued. According to Catherine Merridale, '... reasonable estimate would place the total number of excess deaths for the whole period somewhere around 60 million.' The birth rate of the USSR decreased from 44.0 per thousand in 1926 to 18.0 in 1974, mainly due to increasing urbanization and the rising average age of marriages. The mortality rate demonstrated a gradual decrease as well—from 23.7 per thousand in 1926 to 8.7 in 1974. In general, the birth rates of the southern republics in Transcaucasia and Central Asia were considerably higher than those in the northern parts of the Soviet Union, and in some cases even increased in the post–World War II period, a phenomenon partly attributed to slower rates of urbanization and traditionally earlier marriages in the southern republics. Soviet Europe moved towards sub-replacement fertility, while Soviet Central Asia continued to exhibit population growth well above replacement-level fertility.
The late 1960s and the 1970s witnessed a reversal of the declining trajectory of mortality in the USSR; the rise was especially notable among men of working age, but was also prevalent in Russia and other predominantly Slavic areas of the country. An analysis of the official data from the late 1980s showed that after worsening in the late 1970s and the early 1980s, adult mortality began to improve again. The infant mortality rate increased from 24.7 in 1970 to 27.9 in 1974. Some researchers regarded the rise as mostly real, a consequence of worsening health conditions and services. The rises in both adult and infant mortality were not explained or defended by Soviet officials, and the Soviet government stopped publishing all mortality statistics for ten years. Soviet demographers and health specialists remained silent about the mortality increases until the late 1980s, when the publication of mortality data resumed and researchers could delve into the real causes. The Soviet Union imposed heavy controls on city growth, preventing some cities from reaching their full potential while promoting others. For the entirety of the Soviet Union's existence, the most populous cities were Moscow and Leningrad (both in the Russian SFSR), with Kiev (Ukrainian SSR) a distant third. At the USSR's inception, the fourth and fifth most populous cities were Kharkov (Ukrainian SSR) and Baku (Azerbaijan SSR), but, by the end of the century, Tashkent (Uzbek SSR), which had assumed the position of capital of Soviet Central Asia, had risen to fourth place. Minsk (Byelorussian SSR) saw rapid growth during the 20th century, rising from the 32nd most populous city in the union to the 7th. Under Lenin, the state made explicit commitments to promote the equality of men and women. Many early Russian feminists and ordinary Russian working women actively participated in the Revolution, and many more were affected by the events of that period and the new policies.
Beginning in October 1918, Lenin's government liberalized divorce and abortion laws, decriminalized homosexuality (re-criminalized in 1934), permitted cohabitation, and ushered in a host of reforms. However, without birth control, the new system produced many broken marriages, as well as countless out-of-wedlock children. The epidemic of divorces and extramarital affairs created social hardships when Soviet leaders wanted people to concentrate their efforts on growing the economy. Giving women control over their fertility also led to a precipitous decline in the birth rate, which was perceived as a threat to the country's military power. By 1936, Stalin reversed most of the liberal laws, ushering in a pronatalist era that lasted for decades. In 1917, Russia became the first great power to grant women the right to vote. After heavy casualties in World Wars I and II, women outnumbered men in Russia by a 4:3 ratio; this contributed to the larger role women played in Russian society compared to other great powers at the time. The Soviet Union repressed homosexuality. Even during the period when homosexuality was officially legal after the abolition of the Tsarist penal code criminalising it, Soviet courts attempted to repress non-traditional forms of sexuality, which were widely viewed by Russian revolutionaries as a form of capitalist decadence despite more liberal views on homosexuality from Soviet academic sexologists. After Stalin's consolidation of power, homosexuality became officially recriminalised in 1934. The increased homophobia during this time interval was driven by the economic demands of the First Five-Year Plan, as well as the NKVD's view of homosexuals as "socially harmful elements", although even during this heightened period of repression, a clandestine homosexual subculture was able to persist. Homosexuality remained a criminal offence throughout the remainder of the Soviet Union's existence.
Anatoly Lunacharsky became the first People's Commissar for Education of Soviet Russia. In the beginning, the Soviet authorities placed great emphasis on the elimination of illiteracy. All left-handed children were forced to write with their right hand in the Soviet school system. Literate people were automatically hired as teachers. For a short period, quality was sacrificed for quantity. By 1940, Stalin could announce that illiteracy had been eliminated. Throughout the 1930s, social mobility rose sharply, which has been attributed to reforms in education. In the aftermath of World War II, the country's educational system expanded dramatically, to tremendous effect. In the 1960s, nearly all children had access to education, the only exception being those living in remote areas. Nikita Khrushchev tried to make education more accessible, making it clear to children that education was closely linked to the needs of society. Education also became important in giving rise to the New Man. Citizens directly entering the workforce had the constitutional right to a job and to free vocational training. The education system was highly centralized and universally accessible to all citizens, with affirmative action for applicants from nations associated with cultural backwardness. However, as part of a general antisemitic policy, an unofficial Jewish quota was applied in the leading institutions of higher education by subjecting Jewish applicants to harsher entrance examinations. The Brezhnev era also introduced a rule that required all university applicants to present a reference from the local Komsomol party secretary. According to statistics from 1986, the number of higher education students per 10,000 population was 181 for the USSR, compared to 517 for the US. The Soviet Union was an ethnically diverse country, with more than 100 distinct ethnic groups. The total population of the country was estimated at 293 million in 1991.
According to a 1990 estimate, the majority of the population were Russians (50.78%), followed by Ukrainians (15.45%) and Uzbeks (5.84%). Overall, in 1989 the ethnic demography of the country showed that 69.8% were East Slavic, 17.5% Turkic, 1.6% Armenian, 1.6% Baltic, 1.5% Uralic, 1.5% Tajik, 1.4% Georgian, 1.2% Moldovan and 4.1% of various other ethnic groups. All citizens of the USSR had their own ethnic affiliation. The ethnicity of a person was chosen at the age of sixteen by the child's parents. If the parents did not agree, the child was automatically assigned the ethnicity of the father. Partly due to Soviet policies, some of the smaller minority ethnic groups were considered part of larger ones, such as the Mingrelians of Georgia, who were classified with the linguistically related Georgians. Some ethnic groups voluntarily assimilated, while others were brought in by force. Russians, Belarusians, and Ukrainians, who were all East Slavic and Orthodox, shared close cultural, ethnic, and religious ties, while other groups did not. With multiple nationalities living in the same territory, ethnic antagonisms developed over the years. Members of various ethnicities participated in legislative bodies. Organs of power like the Politburo, the Secretariat of the Central Committee etc., were formally ethnically neutral, but in reality, ethnic Russians were overrepresented, although there were also non-Russian leaders in the Soviet leadership, such as Joseph Stalin, Grigory Zinoviev, Nikolai Podgorny or Andrei Gromyko. During the Soviet era, a significant number of ethnic Russians and Ukrainians migrated to other Soviet republics, and many of them settled there. According to the last census in 1989, the Russian 'diaspora' in the Soviet republics had reached 25 million. In 1917, before the revolution, health conditions were significantly behind those of developed countries.
As Lenin later noted, "Either the lice will defeat socialism, or socialism will defeat the lice". The Soviet health care system was conceived by the People's Commissariat for Health in 1918. Under the Semashko model, health care was to be controlled by the state and would be provided to its citizens free of charge, a revolutionary concept at the time. Article 42 of the 1977 Soviet Constitution gave all citizens the right to health protection and free access to any health institutions in the USSR. Before Leonid Brezhnev became general secretary, the Soviet healthcare system was held in high esteem by many foreign specialists. This changed, however, from Brezhnev's accession and Mikhail Gorbachev's tenure as leader, during which the health care system was heavily criticized for many basic faults, such as the quality of service and the unevenness in its provision. Minister of Health Yevgeniy Chazov, during the 19th Congress of the Communist Party of the Soviet Union, while highlighting such successes as having the most doctors and hospitals in the world, recognized the system's areas for improvement and felt that billions of rubles were squandered. After the revolution, life expectancy for all age groups went up. These improvements continued into the 1960s, when statistics indicated that life expectancy briefly surpassed that of the United States; life expectancy started to decline in the 1970s, possibly because of alcohol abuse. At the same time, infant mortality began to rise. After 1974, the government stopped publishing statistics on the matter. This trend can be partly explained by the number of pregnancies rising drastically in the Asian part of the country, where infant mortality was the highest, while declining markedly in the more developed European part of the Soviet Union.
Soviet dental technology and dental health were considered extremely bad; in 1991, the average 35-year-old had 12 to 14 cavities, fillings or missing teeth. Toothpaste was often not available, and toothbrushes did not conform to standards of modern dentistry. Under Lenin, the government gave small language groups their own writing systems. The development of these writing systems was highly successful, even though some flaws were detected. During the later days of the USSR, countries with the same multilingual situation implemented similar policies. A serious problem when creating these writing systems was that the languages differed greatly from each other dialectally. When a language had been given a writing system and appeared in a notable publication, it would attain 'official language' status. There were many minority languages which never received their own writing system; their speakers were therefore forced to adopt a second language. In some cases the government retreated from this policy, most notably under Stalin, when education was discontinued in languages that were not widespread. These languages were then assimilated into another language, mostly Russian. During World War II, some minority languages were banned, and their speakers accused of collaborating with the enemy. As the most widely spoken of the Soviet Union's many languages, Russian de facto functioned as an official language, as the 'language of interethnic communication' (Russian: язык межнационального общения), but only assumed de jure status as the official national language in 1990. Christianity and Islam had the highest number of adherents among the religious citizens. Eastern Christianity predominated among Christians, with Russia's traditional Russian Orthodox Church being the largest Christian denomination. About 90% of the Soviet Union's Muslims were Sunnis, with Shias being concentrated in the Azerbaijan SSR.
Smaller groups included Roman Catholics, Jews, Buddhists, and a variety of Protestant denominations (especially Baptists and Lutherans). Religious influence had been strong in the Russian Empire. The Russian Orthodox Church enjoyed a privileged status as the church of the monarchy and took part in carrying out official state functions. The immediate period following the establishment of the Soviet state included a struggle against the Orthodox Church, which the revolutionaries considered an ally of the former ruling classes. In Soviet law, the 'freedom to hold religious services' was constitutionally guaranteed, although the ruling Communist Party regarded religion as incompatible with the Marxist spirit of scientific materialism. In practice, the Soviet system subscribed to a narrow interpretation of this right, and in fact used a range of official measures to discourage religion and curb the activities of religious groups. The 1918 Council of People's Commissars decree establishing the Russian SFSR as a secular state also decreed that 'the teaching of religion in all [places] where subjects of general instruction are taught, is forbidden. Citizens may teach and may be taught religion privately.' Among further restrictions, those adopted in 1929 included express prohibitions on a range of church activities, including meetings for organized Bible study. Both Christian and non-Christian establishments were shut down by the thousands in the 1920s and 1930s. By 1940, as many as 90% of the churches, synagogues, and mosques that had been operating in 1917 were closed; the majority of them were demolished or re-purposed for state needs with little concern for their historic and cultural value. More than 85,000 Orthodox priests were shot in 1937 alone. Only a twelfth of the Russian Orthodox Church's priests were left functioning in their parishes by 1941. 
In the period between 1927 and 1940, the number of Orthodox churches in Russia fell from 29,584 to fewer than 500 (1.7%). The Soviet Union was officially a secular state, but a 'government-sponsored program of forced conversion to atheism' was conducted under the doctrine of state atheism. The government targeted religions based on state interests, and while most organized religions were never outlawed, religious property was confiscated, believers were harassed, and religion was ridiculed while atheism was propagated in schools. In 1925, the government founded the League of Militant Atheists to intensify the propaganda campaign. Accordingly, although personal expressions of religious faith were not explicitly banned, a strong sense of social stigma was imposed on them by the formal structures and mass media, and it was generally considered unacceptable for members of certain professions (teachers, state bureaucrats, soldiers) to be openly religious. While persecution accelerated following Stalin's rise to power, a revival of Orthodoxy was fostered by the government during World War II, and the Soviet authorities sought to control the Russian Orthodox Church rather than liquidate it. During the first five years of Soviet power, the Bolsheviks executed 28 Russian Orthodox bishops and over 1,200 Russian Orthodox priests. Many others were imprisoned or exiled. Believers were harassed and persecuted. Most seminaries were closed, and the publication of most religious material was prohibited. By 1941, only 500 churches remained open out of about 54,000 in existence before World War I. Convinced that religious anti-Sovietism had become a thing of the past, and with the looming threat of war, the Stalin administration began shifting to a more moderate religion policy in the late 1930s. Soviet religious establishments overwhelmingly rallied to support the war effort during World War II.
Amid other accommodations to religious faith after the German invasion, churches were reopened. Radio Moscow began broadcasting a religious hour, and a historic meeting between Stalin and Orthodox Church leader Patriarch Sergius of Moscow was held in 1943. The church had the support of the majority of religious people in the USSR even through the late 1980s. The general tendency of this period was an increase in religious activity among believers of all faiths. Under Nikita Khrushchev, the state leadership clashed with the churches in 1958–1964, a period when atheism was emphasized in the educational curriculum and numerous state publications promoted atheistic views. During this period, the number of churches fell from 20,000 to 10,000 between 1959 and 1965, and the number of synagogues dropped from 500 to 97. The number of working mosques also declined, falling from 1,500 to 500 within a decade. Religious institutions remained monitored by the Soviet government, but churches, synagogues, temples, and mosques were all given more leeway in the Brezhnev era. Official relations between the Orthodox Church and the government again warmed to the point that the Brezhnev government twice honored Orthodox Patriarch Alexy I with the Order of the Red Banner of Labour. A poll conducted by Soviet authorities in 1982 recorded 20% of the Soviet population as 'active religious believers.' Culture The culture of the Soviet Union evolved through several stages during its existence. During the first decade following the revolution, there was relative freedom, and artists experimented with several different styles to find a distinctive Soviet style of art. Lenin wanted art to be accessible to the Russian people. On the other hand, hundreds of intellectuals, writers, and artists were exiled or executed, and their work banned; among them were Nikolay Gumilyov, who was shot for alleged conspiracy against the Bolsheviks, and Yevgeny Zamyatin. The government encouraged a variety of trends. 
In art and literature, numerous schools, some traditional and others radically experimental, proliferated. Communist writers Maxim Gorky and Vladimir Mayakovsky were active during this time. As a means of influencing a largely illiterate society, films received encouragement from the state, and much of director Sergei Eisenstein's best work dates from this period. During Stalin's rule, Soviet culture was characterized by the rise and domination of the government-imposed style of socialist realism, while all other trends were severely repressed, with rare exceptions such as Mikhail Bulgakov's works. Many writers were imprisoned and killed. Following the Khrushchev Thaw, censorship was diminished. During this time, a distinctive period of Soviet culture developed, characterized by conformist public life and an intense focus on personal life. Greater experimentation in art forms was again permissible, resulting in the production of more sophisticated and subtly critical work. The government loosened its emphasis on socialist realism; thus, for instance, many protagonists of the novels of author Yury Trifonov concerned themselves with problems of daily life rather than with building socialism. Underground dissident literature, known as samizdat, developed during this late period. In architecture, the Khrushchev era mostly focused on functional design as opposed to the highly decorated style of Stalin's epoch. In music, in response to the increasing popularity of forms of popular music like jazz in the West, many jazz orchestras were permitted throughout the USSR, notably the Melodiya Ensemble, named after the principal record label in the USSR. In the second half of the 1980s, Gorbachev's policies of perestroika and glasnost significantly expanded freedom of expression throughout the country in the media and the press. Sport In the summer of 1923, the Proletarian Sports Society "Dynamo" was established in Moscow as the sports organization of the Soviet secret police, the Cheka. 
On 13 July 1925, the Central Committee of the Russian Communist Party (Bolsheviks) adopted a statement, "About the party's tasks in the sphere of physical culture", which defined the role of physical culture in Soviet society and the party's tasks in the political leadership of the physical culture movement in the country. The Soviet Olympic Committee was formed on 21 April 1951, and the IOC recognized the new body at its 45th session. In the same year, when the Soviet representative Konstantin Andrianov became an IOC member, the USSR officially joined the Olympic Movement. The 1952 Summer Olympics in Helsinki thus became the first Olympic Games for Soviet athletes. The Soviet Union was the biggest rival to the United States at the Summer Olympics, winning six of its nine appearances at the games and also topping the medal tally at the Winter Olympics six times. The Soviet Union's Olympic success has been attributed to its large investment in sports to demonstrate its superpower image and political influence on a global stage. The Soviet Union national ice hockey team won nearly every world championship and Olympic tournament between 1954 and 1991 and never failed to medal in any International Ice Hockey Federation (IIHF) tournament in which it competed. The Soviet Olympic team was notorious for skirting the edge of amateur rules. All Soviet athletes held nominal jobs but were in fact state-sponsored and trained full-time. According to many experts, this gave the Soviet Union a huge advantage over the United States and other Western countries, whose athletes were students or real amateurs. Indeed, the Soviet Union monopolized the top place in the medal standings after 1968 and, until its collapse, placed second only once, at the 1984 Winter Games, after another Eastern Bloc nation, the GDR. Amateur rules were relaxed only in the late 1980s and were almost completely abolished in the 1990s, after the fall of the USSR. 
According to British journalist Andrew Jennings, a KGB colonel stated that the agency's officers had posed as anti-doping authorities from the International Olympic Committee (IOC) to undermine doping tests and that Soviet athletes were "rescued with [these] tremendous efforts". Documents obtained in 2016 revealed the Soviet Union's plans for a statewide doping system in track and field in preparation for the 1984 Summer Olympics in Los Angeles. Dated prior to the country's decision to boycott the Games, the document detailed the existing steroids operations of the program, along with suggestions for further enhancements. In the late 1980s, the government was persuaded to fund construction of a racing yacht specifically to take part in the 1989–1990 Whitbread Round the World Race with a Soviet crew. The 25 metre sloop Fazisi was built in 1989 to the design of Vladislav Murnikov in Poti, Georgia. She came a creditable 11th in a field of 23 boats, but the project was not repeated. Environment Neighbouring countries were aware of the high levels of pollution in the Soviet Union but after the dissolution of the Soviet Union it was discovered that its environmental problems were greater than what the Soviet authorities admitted. The Soviet Union was the world's second largest producer of harmful emissions. In 1988, total emissions in the Soviet Union were about 79% of those in the United States. But since the Soviet GNP was only 54% of that of the United States, this means that the Soviet Union generated 1.5 times more pollution than the United States per unit of GNP. The Chernobyl disaster in the Ukrainian SSR in 1986 was the first major accident at a civilian nuclear power plant. Unparalleled in the world, it resulted in a large number of radioactive isotopes being released into the atmosphere. Radioactive doses were scattered relatively far. 
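The per-unit-of-GNP pollution comparison above is simple division, and the rounding can be checked directly (a minimal sketch; the 79% emissions figure and 54% GNP figure are the article's own, and the variable names are illustrative):

```python
# Soviet figures as fractions of the corresponding US figures (1988, per the text)
soviet_emissions_vs_us = 0.79  # total harmful emissions relative to the US
soviet_gnp_vs_us = 0.54        # GNP relative to the US

# Pollution intensity relative to the US: emissions generated per unit of GNP
intensity_ratio = soviet_emissions_vs_us / soviet_gnp_vs_us

print(round(intensity_ratio, 2))  # ~1.46, i.e. roughly 1.5 times the US level
```

Dividing the emissions share by the GNP share gives about 1.46, which the text rounds to "1.5 times more pollution per unit of GNP".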
Although the long-term effects of the accident were unknown, about 4,000 new cases of thyroid cancer resulting from the accident's contamination had been reported, though these led to a relatively low number of deaths (WHO data, 2005). Another major radioactive accident was the Kyshtym disaster. The Kola Peninsula was one of the places with major problems. Around the industrial cities of Monchegorsk and Norilsk, where nickel, for example, is mined, all forests have been destroyed by contamination, while the northern and other parts of Russia have been affected by emissions. During the 1990s, people in the West were also interested in the radioactive hazards of nuclear facilities, decommissioned nuclear submarines, and the processing of nuclear waste or spent nuclear fuel. It was also known in the early 1990s that the USSR had transported radioactive material to the Barents Sea and Kara Sea, which was later confirmed by the Russian parliament. The sinking of the K-141 Kursk submarine in 2000 further raised concerns in the West. In the past, there had been accidents involving the submarines K-19, K-8, K-129, K-27, K-219, and K-278 Komsomolets. Legacy The legacy of the USSR remains a controversial topic. The socio-economic nature of communist states such as the USSR, especially under Stalin, has also been much debated, varyingly being labelled a form of bureaucratic collectivism, state capitalism, state socialism, or a totally unique mode of production. The USSR implemented a broad range of policies over a long period of time, with a large number of conflicting policies being implemented by different leaders. Some have a positive view of it whilst others are critical towards the country, calling it a repressive oligarchy. The opinions on the USSR are complex and have changed over time, with different generations having different views on the matter as well as on Soviet policies corresponding to separate time periods during its history. 
Western academics have published various analyses of the post-Soviet states' development, claiming that the dissolution was followed by a severe drop in economic and social conditions in these countries, including a rapid increase in poverty, crime, corruption, unemployment, homelessness, rates of disease, infant mortality and domestic violence, as well as demographic losses, income inequality and the rise of an oligarchical class, along with decreases in calorie intake, life expectancy, adult literacy, and income. Between 1988–1989 and 1993–1995, the Gini ratio (a measure of inequality) increased by an average of 9 percentage points for all former Soviet republics. According to Western analysis, the economic shocks that accompanied wholesale privatization were associated with sharp increases in mortality. Russia, Kazakhstan, Latvia, Lithuania, and Estonia saw a tripling of unemployment and a 42% increase in male death rates between 1991 and 1994, and in the following decades only five or six of the post-communist states were judged to be on a path to joining the wealthy capitalist West, while most fell behind, some to such an extent that it would take over fifty years to catch up to where they were before the fall of the Soviet Bloc. As of 2011, the experience of the former Soviet republics was mixed, with some having recovered in terms of gross domestic product and others not. There are large wealth disparities, and many post-Soviet economies are described as oligarchic. Since the dissolution of the Soviet Union, annual polling by the Levada Center has shown that over 50% of Russia's population regretted the event, the only exception being 2012, when that share dipped below 50 percent. A 2018 poll showed that 66% of Russians regretted the fall of the Soviet Union, setting a 15-year record, and the majority of those regrets came from people older than 55. 
In 2020, polls conducted by the Levada Center found that 75% of Russians agreed that the Soviet era was the greatest era in their country's history. According to the New Russia Barometer (NRB) polls by the Centre for the Study of Public Policy, 50% of Russian respondents reported a positive impression of the Soviet Union in 1991. This increased to about 75% of NRB respondents in 2000, dropping slightly to 71% in 2009. Throughout the 2000s, an average of 32% of NRB respondents supported the restoration of the Soviet Union. In a 2021 poll, a record 70% of Russians indicated they had a mostly/very favourable view of Joseph Stalin. In Armenia, 12% of respondents said the USSR collapse did good, while 66% said it did harm. In Kyrgyzstan, 16% of respondents said the collapse of the USSR did good, while 61% said it did harm. In a 2018 Rating Sociological Group poll, 47% of Ukrainian respondents had a positive opinion of Soviet leader Leonid Brezhnev, who ruled the Soviet Union from 1964 to 1982, while viewing Lenin, Stalin, and Gorbachev very negatively. A 2021 poll conducted by the Levada Center found that 49% of Russians prefer the USSR's political system, while 18% prefer the current political system and 16% would prefer a Western democracy. A further 62% of people polled preferred the Soviet system of central planning, while 24% prefer a market-based system. According to the Levada Center's polls, the primary reasons cited for Soviet nostalgia are the advantages of the shared economic union between the Soviet republics, including perceived financial stability. This was referenced by up to 53% of respondents in 2016. At least 43% also lamented the loss of the Soviet Union's global political superpower status. About 31% cited the loss of social trust and capital. The remainder of the respondents cited a mix of reasons ranging from practical travel difficulties to a sense of national displacement. 
The 1941–1945 period of World War II is still known in Russia as the 'Great Patriotic War'. The war became a topic of great importance in cinema, literature, history lessons at school, the mass media, and the arts. As a result of the massive losses suffered by the military and civilians during the conflict, Victory Day, celebrated on 9 May, is still one of the most important and emotional dates in Russia. Catherine Wanner asserts that Victory Day commemorations are a vehicle for Soviet nostalgia, as they "kept alive a mythology of Soviet grandeur, of solidarity among the Sovietskii narod, and of a sense of self as citizen of a superpower state". Russian Victory Day parades are organized annually in most cities, with the central military parade taking place in Moscow (just as during Soviet times). Additionally, the recently introduced Immortal Regiment on 9 May sees millions of Russians carry the portraits of their relatives who fought in the war. Russia also retains other Soviet holidays, such as Defender of the Fatherland Day (23 February), International Women's Day (8 March), and International Workers' Day. In some post-Soviet republics, there is a more negative view of the USSR, although there is no unanimity on the matter. In large part due to the Holodomor, ethnic Ukrainians have a negative view of the Soviet Union. Russian-speaking Ukrainians of Ukraine's southern and eastern regions have a more positive view of the USSR. In some countries with internal conflict, there is also nostalgia for the USSR, especially among refugees of the post-Soviet conflicts who have been forced to flee their homes and have been displaced. The many Russian enclaves in the former Soviet republics, such as Transnistria, have in general a positive remembrance of it. As the ruling party of another powerful communist state, the Chinese Communist Party (CCP) has continually placed an emphasis on understanding the Soviet Union and its collapse as lessons for itself. 
In 2011, the CCP completed a study focusing on four reasons for the Soviet collapse. First, Gorbachev's rapid pursuit of democracy, which undermined the centrality of the Communist Party. Second, rapid privatization of state-owned enterprises. Third, the end of the ideological monopoly of the Communist Party, leading to historical nihilism and attacks on socialism. Fourth, the West's promotion of a peaceful evolution, cultivating a pro-West "fifth column" in Soviet society. A 2023 Center for Strategic and International Studies report argued that modern Chinese scholarship attributes the Soviet collapse primarily to the concept of historical nihilism, which it equates with the penetration of Western ideas into society. In December 1989, then-leader Jiang Zemin first attributed both the fall of communism in Eastern Europe and the Tiananmen Square protests to historical nihilism. A second current in Chinese writing comes from Sovietologists who argue that the Communist Party of the Soviet Union's institutions and policies were more responsible for the collapse than its ideology. Despite Xi Jinping's focus on the historical-nihilism current, he stated in 2021 that "the Soviet Communist Party separated itself from the people and became a privileged bureaucratic group". The report also noted that between 1960 and 1980, China's state propaganda chose to publish "gray-cover books" (Chinese: 灰皮书, romanized: huī pí shū), a large range of socialist and anti-socialist foreign literature criticizing the Soviet system. These included Leon Trotsky's The Revolution Betrayed and Stalin, works of Eduard Bernstein and Karl Kautsky, Milovan Djilas' The New Class, and Friedrich Hayek's The Road to Serfdom. The report suggested these publications influenced interest in the Soviet failures among Chinese leaders Xi, Jiang, and Deng Xiaoping. 
The left's view of the USSR is complex. While some leftists regard the USSR as an example of state capitalism or as an oligarchical state, other leftists admire Vladimir Lenin and the Russian Revolution. Council communists generally view the USSR as having failed to create class consciousness, turning into a corrupt state in which the elite controlled society. Trotskyists believe that the ascendancy of the Stalinist bureaucracy produced a degenerated or deformed workers' state, where the capitalist elite had been replaced by an unaccountable bureaucratic elite and there was no true democracy or workers' control of industry. In particular, American Trotskyist David North noted that the generation of bureaucrats that rose to power under Stalin's tutelage presided over the stagnation and breakdown of the Soviet Union. Many anti-Stalinist leftists such as anarchists are extremely critical of Soviet authoritarianism and repression. Much of the criticism the USSR receives is centered on its massacres, its centralized hierarchy, and its mass political repression, as well as violence towards government critics and political dissidents such as other leftists. Critics also point to its failure to implement any substantial worker cooperatives or measures of worker liberation, as well as its corruption and authoritarian nature. Anarchists are also critical of the country, labeling the Soviet system as red fascism. Factors contributing to anarchist animosity towards the USSR included the Soviet destruction of the Makhnovist movement after an initial alliance, the suppression of the anarchist Kronstadt rebellion, and the defeat of the rival anarchist factions by the Soviet-supported Communist faction during the Spanish Civil War. Maoists also have a mixed opinion of the USSR, viewing it negatively during the Sino-Soviet split and denouncing it as revisionist and as having reverted to capitalism. 
The Chinese government in 1963 articulated its criticism of the USSR's system and promoted China's ideological line as an alternative. After the dissolution of the Soviet Union, the Japanese Communist Party (JCP) released a press statement titled "We welcome the end of a party which embodied the historical evil of great power chauvinism and hegemonism". Noam Chomsky called the collapse of the Soviet Union "a small victory for socialism, not only because of the fall of one of the most anti-socialist states in the world, where working people had fewer rights than in the West, but also because it freed the term 'socialism' from the burden of being associated in the propaganda systems of East and West with Soviet tyranny—for the East, in order to benefit from the aura of authentic socialism, for the West, in order to demonize the concept." Some scholars on the left have posited that the end of the Soviet Union and communism as a global force allowed neoliberal capitalism to become a global system, which has resulted in rising economic inequality. In her 2012 book The Communist Horizon, Jodi Dean argued that there is a double standard among all sides of the political spectrum, including conservatives, liberals, and social democrats, in how communism and capitalism are perceived nearly two decades after the dissolution of the Soviet Union. Dean stated that the worst excesses of capitalism are often minimized, while communism is often equated only with the Soviet Union, and experiments in Eastern Europe, Latin America, Africa, and Asia are often ignored, with an emphasis placed on the Stalin era and its violent excesses including gulags, purges, droughts and famines, and almost no consideration for the industrialization and modernization of the Soviet economy, the successes of Soviet science (such as the Soviet space program), or the rise in the standard of living for the once predominantly agrarian society. 
The dissolution of the Soviet Union is therefore seen as the proof that communism cannot work, allowing for all left-wing criticism of the excesses of neoliberal capitalism to be silenced, for the alternatives would supposedly inevitably result in economic inefficiency and violent authoritarianism. Michael Parenti's 1997 book Blackshirts and Reds takes the controversial position of defending the Soviet Union and other communist countries from reflexive condemnation, arguing that they featured a number of advantages over capitalist countries, e.g., by ensuring less economic inequality. He later argues that the Soviet Union's "well-publicized deficiencies and injustices" were exacerbated by the Russian Civil War, the Nazi-led multinational invasion, and by non-military modes of capitalist intervention against the Eastern Bloc. Moreover, he claims that "pure socialists" and "left anticommunists" had failed to specify a viable alternative to the "siege socialism" implemented in the Soviet model. Parenti argued the Soviet Union played a crucial role in "tempering the worst impulses of Western capitalism and imperialism" that in the post-Cold War era is "no longer restrained by a competing system" and is now "rolling back the many gains that working people in the West have won over the years". By offering a rare defense of 20th century Communism, Blackshirts and Reds has elicited strong reactions from anarchist and Communist publications. Both liberals and neoconservatives in the United States celebrated the dissolution of the Soviet Union. Ideas such as Francis Fukuyama's end of history and Charles Krauthammer's unipolar moment gained prominence to describe the victory of Western liberal democracy. At the same time, both sides of the political spectrum in the US criticized the George H. W. Bush administration for its comparative cautiousness and political support of Gorbachev over more pro-democracy and nationalist forces in the Soviet Union. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-play-31] | [TOKENS: 10515] |
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he has Canadian citizenship since his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and their leadership in the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes, and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Speaking to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with its competitor Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign.
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several commercially successful electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase of the company on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of Hunter Biden's laptop controversy in the lead-up to the 2020 presidential election. After a Twitter poll, Musk promised to step down as CEO; five months later, he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, most of whom avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 special election in Texas's 34th congressional district. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income, and endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation impacted Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that they had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his involvement with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the 130-day limit for special government employees expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, with its most notable event being Musk alleging Trump had ties to sex offender Jeffrey Epstein on X (formerly Twitter) on June 5, 2025. Trump responded on Truth Social stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth tax, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, and has repeatedly pushed for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdowns restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place, as Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans for coming to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, in contrast to other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-32]
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his Canadian-born mother. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By April 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. After a Twitter poll, Musk promised to step down as CEO; five months later, he did so, transitioning to executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 special election in Texas's 34th congressional district. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income, and endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation impacted Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration, in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his involvement with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. On June 5, 2025, he posted on X: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public."
A feud began between Musk and Trump, with its most notable event being Musk alleging Trump had ties to sex offender Jeffrey Epstein on X (formerly Twitter) on June 5, 2025. Trump responded on Truth Social stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funds Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth tax, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars. He has repeatedly pushed for humanity colonizing Mars, in order to become an interplanetary species and lower the risks of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While he describes himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
Also in July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans for coming to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth was derived from Tesla stock, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and who often makes controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-32]
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet" for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's, or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all of Earth's dry land. Fine dust is prevalent across the surface and the atmosphere, picked up and spread, in the low Martian gravity, even by the weak winds of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere consisting mainly of relatively flat, low-lying plains and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground, but it also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) lasts 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
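As a quick sanity check on these calendar figures, the relationships can be sketched with a few lines of arithmetic. This uses only the rounded values quoted above; note that the precise sol length is 24.66 hours, which yields the commonly cited figure of about 668.6 sols per Martian year.

```python
# Relating the Martian year and day figures quoted above (rounded inputs).
earth_days_per_mars_year = 687   # Martian solar year, in Earth days
sol_hours = 24.6                 # length of a Martian solar day (sol), in hours

# Convert the Martian year to Earth years (Julian year = 365.25 days).
earth_years = earth_days_per_mars_year / 365.25

# Number of sols in a Martian year, from the rounded inputs.
sols_per_mars_year = earth_days_per_mars_year * 24 / sol_hours

print(f"{earth_years:.2f} Earth years")   # ≈ 1.88, matching the text
print(f"≈ {sols_per_mars_year:.0f} sols") # ≈ 670 (≈ 668.6 with the exact 24.66 h sol)
```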
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, dominates the planet's ongoing geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Visible to the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars was attempted with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, Sun or Earth; in the same year came the first uncontrolled impact (Mars 2) and the first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often-proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary periods are the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches.
Physical characteristics Mars is approximately half the diameter of Earth, or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in the surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity begins to increase again. 
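The 38% surface-gravity figure quoted above follows directly from the mass and size figures in the same sentence, since surface gravity scales as M/R². A minimal arithmetic check (the 0.532 radius ratio is an outside value consistent with "approximately half the diameter"):

```python
# Consistency check: surface gravity scales as g ∝ M / R^2.
mass_ratio = 0.107    # Mars / Earth mass (~11%, as stated above)
radius_ratio = 0.532  # Mars / Earth radius (about half the diameter; assumed value)

gravity_ratio = mass_ratio / radius_ratio**2
print(f"{gravity_ratio:.2f}")  # ≈ 0.38, i.e. about 38% of Earth's surface gravity
```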
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris, preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. Its radius is approximately 1,650–1,675 km, around half of Mars's radius, and it is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 ± 67 kilometres (381 ± 42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. 
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, which is significantly less than the 1.84 millisieverts per day or 22 millirads per day experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although there is no evidence of a structured global magnetic field on Mars, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. 
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. 
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea-level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, assigning it a definite height is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. 
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps) has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly making Mars a planet with a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high-energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. 
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. 
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen, along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by a non-biological process such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. The higher concentration of atmospheric CO2 and lower surface pressure, compared to Earth, may explain why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive and unexpected solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, similar to Earth's. 
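The ~240 m/s sound speed measured by Perseverance is consistent with the ideal-gas estimate c = √(γkT/m) for a mostly-CO2 atmosphere. In this rough sketch, the adiabatic index and temperature are assumed typical values, not figures from the text:

```python
# Ideal-gas estimate of sound speed: c = sqrt(gamma * k * T / m).
k = 1.380649e-23          # Boltzmann constant, J/K
gamma = 1.3               # adiabatic index of CO2 (assumed value)
T = 240.0                 # near-surface daytime temperature, K (assumed value)
m = 44.01 * 1.6605e-27    # mass of a CO2 molecule, kg

c = (gamma * k * T / m) ** 0.5
print(f"c ≈ {c:.0f} m/s")  # ≈ 243 m/s, in line with Perseverance's ~240 m/s
```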
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity: it approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase the global temperature. Seasonally, deposits of carbon dioxide frost (dry ice) cover the polar ice caps. Hydrology Mars contains water in large amounts, but most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest elevations are the pressure and temperature high enough for liquid water to exist for short periods. 
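The 43% sunlight figure quoted above is a direct consequence of the inverse-square law applied to Mars's 1.52 AU mean distance:

```python
# Solar flux falls off with the inverse square of distance from the Sun.
distance_au = 1.52                 # Mars's mean distance, in astronomical units
flux_fraction = 1 / distance_au**2
print(f"{flux_fraction:.0%}")      # ≈ 43% of the sunlight Earth receives
```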
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with carbon dioxide (dry ice) snow. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along crater and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. 
No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. 
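The "five to seven times" enrichment quoted above follows from the two D/H ratios and the ±1.7 uncertainty on the Martian measurement:

```python
# Deuterium enrichment of the Martian atmosphere relative to Earth,
# using the D/H values and uncertainty quoted in the text.
d_h_mars = 9.3e-4
d_h_mars_err = 1.7e-4
d_h_earth = 1.56e-4

low = (d_h_mars - d_h_mars_err) / d_h_earth
high = (d_h_mars + d_h_mars_err) / d_h_earth
print(f"{low:.1f}x to {high:.1f}x")  # ≈ 4.9x to 7.1x Earth's ratio
```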
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Earth and Mars, is the second lowest of any planet for missions from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars has its closest approach to Earth (opposition) in a synodic period of 779.94 days. 
It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
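The 779.94-day synodic period quoted above follows from combining the two planets' orbital periods, 1/S = 1/P_Earth − 1/P_Mars. A sketch using Mars's 687-day year; the 365.256-day sidereal year for Earth is an outside value, not quoted in the text:

```python
# Synodic period from the two sidereal orbital periods.
p_earth = 365.256   # Earth's sidereal year, days (assumed value)
p_mars = 686.98     # Mars's sidereal year, days (the "687 days" above)

synodic = 1 / (1 / p_earth - 1 / p_mars)
print(f"{synodic:.1f} days")  # ≈ 779.9, matching the quoted 779.94-day cycle
```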
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. 
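The 11-hour interval between successive risings of Phobos quoted above follows from the same relative-motion arithmetic as a synodic period, since Phobos orbits faster than Mars rotates. Phobos's 7.66-hour orbital period and Mars's 24.62-hour sidereal rotation period are outside values, not quoted in the text:

```python
# Time between successive risings of Phobos as seen from the Martian
# surface: 1 / (1/P_orbit - 1/P_rotation), valid because Phobos's
# orbital period is shorter than Mars's rotation period.
p_phobos = 7.66      # Phobos's orbital period, hours (assumed value)
p_rotation = 24.62   # Mars's sidereal rotation period, hours (assumed value)

apparent = 1 / (1 / p_phobos - 1 / p_rotation)
print(f"{apparent:.1f} h")  # ≈ 11.1 h, matching "rises again in just 11 hours"
```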
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a multinational study has suggested that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks recording tidal processes on the planet suggests that those tides may have been regulated by this past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. 
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις; more commonly, though, the Greek name for the planet was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known to Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星), based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet.
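The accuracy of the Babylonian 79-year relation can be checked against modern orbital values. A sketch, with the modern period values taken as assumptions rather than from the text:

```python
# Check the Babylonian 79-year Mars relation (37 synodic periods,
# 42 zodiac circuits) against modern values.
# Modern reference values (assumed): synodic period 779.94 d,
# sidereal period 686.98 d, Julian year 365.25 d.
YEAR = 365.25
days_79yr = 79 * YEAR               # 28,854.75 days

synodic_est = days_79yr / 37        # one synodic period per the relation
sidereal_est = days_79yr / 42       # one zodiac circuit per the relation

print(f"{synodic_est:.1f} d per synodic period (modern: 779.94 d)")
print(f"{sidereal_est:.1f} d per zodiac circuit (modern: 686.98 d)")
```

Both estimates land within a tenth of a day of the modern figures, showing why the relation served Babylonian prediction schemes so well.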
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first telescopic astronomical observations, including of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun–Earth distance; this was first done by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus ever observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes had reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894 and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public.
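Kepler's finding that Mars speeds up near the Sun follows directly from conservation of angular momentum: since v_p·r_p = v_a·r_a along the orbit, the perihelion-to-aphelion speed ratio is (1+e)/(1−e). A rough numerical sketch, with the orbital constants taken as standard published values rather than from the text:

```python
import math

# Mars's speed at perihelion vs. aphelion from the vis-viva equation.
# Angular momentum conservation gives v_p / v_a = (1 + e) / (1 - e).
GM_SUN = 1.32712e20     # solar gravitational parameter, m^3/s^2 (assumed)
AU = 1.495979e11        # astronomical unit, m (assumed)
a = 1.52368 * AU        # Mars semi-major axis (assumed)
e = 0.0934              # Mars orbital eccentricity (assumed)

v_perihelion = math.sqrt(GM_SUN / a * (1 + e) / (1 - e))   # vis-viva at r_p = a(1-e)
v_aphelion = math.sqrt(GM_SUN / a * (1 - e) / (1 + e))     # vis-viva at r_a = a(1+e)

print(f"perihelion: {v_perihelion/1000:.2f} km/s, aphelion: {v_aphelion/1000:.2f} km/s")
# Their ratio equals (1+e)/(1-e): Mars moves roughly 21% faster at perihelion.
```

The even-eccentricity orbit of Mars was, in fact, what made it the decisive test case for Kepler's ellipse.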
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers), in combination with the canals, led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Eugène Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew past in 1963, although contact had been lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet taken from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were radically revised. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between Viking 1's shutdown in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without making contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted to this day.
It produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit, among them 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, the ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Further missions to Mars are planned. As of February 2024, debris from these missions had reached over seven tons, most of it consisting of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars; key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W.
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
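Why a position near the habitable zone's outer edge is not by itself enough for liquid water can be sketched with a simple equilibrium-temperature estimate. All constants here are standard published values, assumed rather than taken from the text:

```python
# Equilibrium (blackbody) temperature of Mars:
#   T_eq = (S * (1 - A) / (4 * sigma))^(1/4), with S the insolation at Mars.
S0 = 1361.0          # solar constant at 1 AU, W/m^2 (assumed)
D_MARS = 1.524       # Mars mean distance from the Sun, AU (assumed)
ALBEDO = 0.25        # Mars Bond albedo (assumed)
SIGMA = 5.670374e-8  # Stefan-Boltzmann constant, W/m^2/K^4

flux = S0 / D_MARS**2                                 # insolation at Mars's distance
t_eq = (flux * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25    # equilibrium temperature, K

print(f"T_eq ≈ {t_eq:.0f} K ({t_eq - 273.15:.0f} °C)")
# ≈ 210 K, about -63 °C: well below water's freezing point, even before
# accounting for how little greenhouse warming the thin CO2 atmosphere adds.
```

This first-order estimate matches the commonly quoted equilibrium temperature of Mars of roughly 210 K.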
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters have both been claimed as possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by meteor impacts, can preserve signs of life on Earth; such glass has also been found in impact craters on Mars and could likewise have preserved signs of life there, if life existed at those sites. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination.
Although the find is highly intriguing, no definitive determination of a biological or abiotic origin for this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none has come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In 2021, China announced plans to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared in April 2024, Elon Musk envisioned the beginning of a Mars colony within the following twenty years, enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth and in-situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War".
The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations, combined with Percival Lowell's books on the subject, put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars continues. Reminiscent of the canali observations, these speculations are based on small-scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave rise to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's Barsoom series, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation.
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Superpower] | [TOKENS: 4483] |
Contents Superpower Superpower describes a sovereign state or supranational union that holds a dominant position characterized by the ability to exert influence and project power on a global scale. This is done through the combined means of economic, military, technological, political, and cultural strength as well as diplomatic and soft power influence. Traditionally, superpowers are preeminent among the great powers. While a great power state is capable of exerting its influence globally, superpowers are states so influential that no significant action can be taken by the global community without first considering the positions of the superpowers on the issue. In 1944, during World War II, the term was first applied to the British Empire, the Soviet Union, and the United States. During the Cold War, the British Empire dissolved, leaving the United States and the Soviet Union to dominate world affairs. At the end of the Cold War and the dissolution of the Soviet Union in 1991, the United States became the world's sole superpower, a position sometimes referred to as that of a "hyperpower". Since the late 2010s and into the 2020s, China has increasingly been described as an emerging superpower or an established one, as it is "the only country with enough power to jeopardize the current global order". Despite its perceived decline in power and international reputation, the United States remains a superpower, primarily owing to its alliances and economic influence. Origin There is no agreed definition of a superpower, and usage differs between sources. However, a fundamental characteristic consistent with all definitions is a nation or state that has mastered the seven dimensions of state power, namely geography, population, economy, resources, military, diplomacy, and national identity.
The term was first used to describe nations with greater than great power status as early as 1944, but only gained its specific meaning with regard to the United States and the Soviet Union after World War II. This was because the United States and the Soviet Union had proved themselves to be capable of casting great influence in global politics and military dominance. The term in its current political meaning was coined by Dutch-American geostrategist Nicholas Spykman in a series of lectures in 1943 about the potential shape of a new post-war world order. This formed the foundation for the book The Geography of the Peace, which referred primarily to the unmatched maritime global supremacy of the British Empire and the United States as essential for peace and prosperity in the world. A year later, William T. R. Fox, an American foreign policy professor, elaborated on the concept in the book The Superpowers: The United States, Britain and the Soviet Union – Their Responsibility for Peace which spoke of the global reach of a super-empowered nation. Fox used the word superpower to identify a new category of power able to occupy the highest status in a world in which—as the war then raging demonstrated—states could challenge and fight each other on a global scale. According to him, at that moment, there were three states that were superpowers, namely the United States, the Soviet Union, and the United Kingdom. The British Empire was the most extensive empire in world history and considered the foremost power, holding sway over 25% of the world's population and controlling about 25% of the Earth's total land area, while the United States and the Soviet Union grew in power before and during World War II. The UK would face serious political, financial, and colonial issues after World War II that left it unable to match Soviet or American power.
Ultimately, Britain's empire would gradually dissolve over the course of the 20th century, sharply reducing its global power projection. According to Lyman Miller, "[t]he basic components of superpower stature may be measured along four axes of power: military, economic, political, and cultural (or what political scientist Joseph Nye has termed 'soft power')". In the opinion of Kim Richard Nossal of Queen's University in Canada, "generally, this term was used to signify a political community that occupied a continental-sized landmass; had a sizable population (relative at least to other major powers); a superordinate economic capacity, including ample indigenous supplies of food and natural resources; enjoyed a high degree of non-dependence on international intercourse; and, most importantly, had a well-developed nuclear capacity (eventually, normally defined as second strike capability)". In the opinion of Professor Paul Dukes, "a superpower must be able to conduct a global strategy, including the possibility of destroying the world; to command vast economic potential and influence; and to present a universal ideology", although "many modifications may be made to this basic definition". According to Professor June Teufel Dreyer, "[a] superpower must be able to project its power, soft and hard, globally". In his book Superpower: Three Choices for America's Role in the World, Dr. Ian Bremmer, president of the Eurasia Group, argues that a superpower is "a country that can exert enough military, political, and economic power to persuade nations in every region of the world to take important actions they would not otherwise take". 
Apart from its common denotation of the foremost post-WWII states, the term superpower has colloquially been applied by some authors retrospectively to describe various preeminent ancient great empires or medieval great powers, in works such as Channel 5 (UK)'s documentary Rome: The World's First Superpower or the reference in The New Cambridge Medieval History to "the other superpower, Sasanian Persia". During the Cold War The 1956 Suez Crisis suggested that Britain, financially weakened by two world wars, could not then pursue its foreign policy objectives on an equal footing with the new superpowers without sacrificing convertibility of its reserve currency as a central goal of policy. As the majority of World War II had been fought far from its national boundaries, the United States had not suffered the industrial destruction nor massive civilian casualties that marked the wartime situation of the countries in Europe or Asia. The war had reinforced the position of the United States as the world's largest long-term creditor nation and its principal supplier of goods; moreover, it had built up a strong industrial and technological infrastructure that had greatly advanced its military strength into a primary position on the global stage. Despite attempts to create multinational coalitions or legislative bodies (such as the United Nations), it became increasingly clear that the superpowers had very different visions about what the post-war world ought to look like and after the withdrawal of British aid to Greece in 1947, the United States took the lead in containing Soviet expansion in the Cold War. The two countries opposed each other ideologically, politically, militarily, and economically. The Soviet Union promoted the ideology of Marxism–Leninism, planned economy, and a one-party state while the United States promoted the ideologies of liberal democracy and the free market in a capitalist market economy. 
This was reflected in the Warsaw Pact and NATO military alliances, respectively, as most of Europe became aligned with either the United States or the Soviet Union. These alliances implied that these two nations were part of an emerging bipolar world, in contrast with a previously multipolar world. The idea that the Cold War period revolved around only two blocs, or even only two nations, has been challenged by some scholars in the post–Cold War era, who have noted that the bipolar world only exists if one ignores all of the various movements and conflicts that occurred without influence from either of the two superpowers. Additionally, much of the conflict between the superpowers was fought in proxy wars, which more often than not involved issues more complex than the standard Cold War oppositions. After the Soviet Union disintegrated in the early 1990s, the term "hyperpower" began to be applied to the United States as the sole remaining superpower of the Cold War era. This term, popularized by French foreign minister Hubert Védrine in the late 1990s, is controversial, and the validity of classifying the United States in this way is disputed. One notable opponent of this theory is Samuel P. Huntington, who rejects it in favor of a multipolar balance of power. Other international relations theorists, such as Henry Kissinger, theorize that because the threat of the Soviet Union no longer exists to formerly American-dominated regions such as Western Europe and Japan, American influence has been declining since the end of the Cold War, because such regions no longer need protection or no longer necessarily share the United States' foreign policies. After the Cold War After the dissolution of the Soviet Union in 1991, which ended the Cold War, the post–Cold War world has in the past been considered by some to be a unipolar world, with the United States as the world's sole remaining superpower. In 1999, political scientist and author Samuel P.
Huntington wrote: "The United States, of course, is the sole state with preeminence in every domain of power – economic, military, diplomatic, ideological, technological, and cultural – with the reach and capabilities to promote its interests in virtually every part of the world". However, Huntington rejected the claim that the world was unipolar, arguing: "There is now only one superpower. But that does not mean that the world is unipolar", describing it instead as "a strange hybrid, a uni-multipolar system with one superpower and several major powers". He further wrote that "Washington is blind to the fact that it no longer enjoys the dominance it had at the end of the Cold War. It must relearn the game of international politics as a major power, not a superpower, and make compromises". Experts argue that this older single-superpower assessment of global politics is too simplified, in part because of the difficulty in classifying the European Union at its current stage of development. Others argue that the notion of a superpower is outdated, considering complex global economic interdependencies, and propose that the world is multipolar. A 2012 report by the National Intelligence Council predicted that the United States' superpower status will have eroded to merely being first among equals by 2030, but that it would remain highest among the world's most powerful countries because of its influence in many different fields and global connections that the great regional powers of the time would not match. Additionally, some experts have suggested the possibility of the United States losing its superpower status completely in the future, citing speculation about its decline in power relative to the rest of the world, economic hardships, a declining dollar, Cold War allies becoming less dependent on the United States, and the emergence of future powers around the world. According to a RAND Corporation paper by American diplomat James Dobbins, Professor Howard J.
Shatz, and policy analyst Ali Wyne, Russia, amid the breakdown of a disintegrating unipolar world order, while not a peer competitor to the United States, would still remain a player and a potential rogue state that could undermine global affairs. The West could contain Russia with methods like those employed during the Cold War with the Soviet Union, though this would be tested by Russia's overt and covert efforts to destabilize Western alliances and political systems. On the other hand, China is a peer competitor to the United States that cannot be contained, and will be a far more challenging entity for the West to confront. The authors state that China's military dominance in the Asia-Pacific is already eroding American influence at a rapid pace, and the costs for the US to defend its interests there will continue to rise. Moreover, China's economic influence broke out of its regional confines long ago and is on track to directly contest the US role as the center of economic trade and commerce. Potential superpowers The term potential superpower has been applied by scholars and other qualified commentators to the possibility of several political entities achieving superpower status. In the 1980s, some commentators thought Japan would become a superpower due to its large GDP and high economic growth at the time. However, Japan's economy crashed in 1991, creating a long period of economic slump in the country which has become known as the Lost Decades. Due to their large markets, growing military strength, economic potential, and influence in international affairs, China, the European Union, Russia, and India are among the political entities most cited as having the potential of achieving superpower status in the 21st century, with China often seen as the only de facto or near-superpower that rivals the United States.
In 2013, some political scientists and other commentators suggested that such countries might simply be emerging powers, as opposed to potential superpowers. In 2020, the European Union was called a "regulatory superpower" due to the Brussels effect. The classification of China as a superpower has been the subject of great academic and geopolitical debate, centering on whether to recognize it as a de facto superpower or as a superpower contender. Since a 2012 Cornell University–Lund Critical Debate concluded that "China is not yet a superpower", a growing number of proponents have highlighted China's modern military, regional influence, cultural exports, rapid advancements in artificial intelligence, and economic and manufacturing volume as signs of global dominance in the 2020s. However, opponents point to persistent domestic challenges, such as an ageing and shrinking population and a lack of skilled immigration, alongside international concerns over its soft-power status due to human rights issues, its lack of hard-power capabilities through a global military alliance system, and the dominance of the U.S. dollar in global trade. Increasing doubts emerged in 2022 about the potential of Russia to gain superpower status, given its declining economy, severe military underperformance during the invasion of Ukraine, and its loss of influence in Central Asia, a region once dominated by Moscow for centuries. Superpower collapse Dramatic changes occurred in the Soviet Union and the Eastern Bloc during the 1980s and early 1990s, with perestroika and glasnost, the fall of the Berlin Wall in November 1989, and finally the dissolution of the Soviet Union in December 1991. As early as 1970, Andrei Amalrik had predicted Soviet collapse, and Emmanuel Todd made a similar prediction in 1976. Owing to the poor showing of Russia's conventional forces during the invasion of Ukraine, Russia was compared to a "Potemkin superpower" by Paul Krugman.
Russia is a nuclear-weapon state. The Suez Crisis of 1956 is considered by some commentators to be the beginning of the end of Britain's period as a superpower, but other commentators have pointed to earlier events, such as the postwar Age of Austerity, the Anglo-American loan of 1946, the Winter of 1946–47, and the independence of British India, as other key points in Britain's decline and loss of superpower status. The Suez Crisis in particular is regarded by historians as a political and diplomatic disaster for the British Empire, as it led to large-scale international condemnation, including extensive pressure from the United States and the Soviet Union. This forced the British and the French to withdraw in embarrassment and cemented the increasingly bipolar Cold War politics between the Soviet Union and the United States. In the 1960s, the movement for decolonization reached its peak, with the remaining imperial holdings achieving independence, accelerating the transition from the British Empire to the Commonwealth of Nations. As the Empire continued to crumble, the home islands of the United Kingdom experienced deindustrialization throughout the 1970s, coupled with high inflation and industrial unrest that unraveled the postwar consensus. This led some economists to refer to Britain as the "sick man of Europe". In 1976, the United Kingdom had to seek assistance from the International Monetary Fund (IMF), which, ironically, it had helped to create, receiving funding of $3.9 billion, the largest loan ever requested up to that point. In 1979, the country suffered major widespread strikes known as the Winter of Discontent. All these factors were seen by academics, economists and politicians as symbolising Britain's postwar decline. Lastly, the Handover of Hong Kong to China in July 1997 was seen by experts as the definitive end of the British Empire. Nevertheless, the United Kingdom today retains global soft power in the 21st century.
Its capital city, London, continues to be regarded as one of the pre-eminent cities in the world, being ranked as a global city by the Mori Foundation. In 2022, the United Kingdom was ranked the foremost European country in terms of soft power by Brand Finance. The United Kingdom also retains a formidable military and is one of the recognized nuclear-weapon states. In After the Empire: The Breakdown of the American Order (2001), French sociologist Emmanuel Todd predicts the eventual decline and fall of the United States as a superpower: "After years of being perceived as a problem-solver, the US itself has now become a problem for the rest of the world." Since the 2010s, as a result of asymmetric polarization within the United States, globally perceived U.S. foreign policy failures, and China's growing influence around the world, some academics and geopolitical experts have argued that the United States may already be experiencing a decay in its soft power around the world. Superpower disengagement Superpower disengagement is a foreign policy option whereby the most powerful nations, the superpowers, reduce their interventions in an area. Such disengagement could be multilateral among superpowers or lesser powers, bilateral between two superpowers, or unilateral. It could mean an end to either direct or indirect interventions. For instance, disengagement could mean that the superpowers remove their support of proxies in proxy wars, de-escalating a superpower conflict back to a local problem based on local disputes. Disengagement can create buffers between superpowers that might prevent conflicts or reduce their intensity.[citation needed] The term usually refers to various policy proposals during the Cold War that attempted to defuse tensions between the Soviet Union and the United States, largely because of the risk that any superpower conflict could escalate to nuclear war. 
Examples of unilateral disengagement include Joseph Stalin's decision to end Soviet support for the communist guerrillas in Greece during the Greek Civil War, and Richard Nixon's withdrawal of US troops from Vietnam in the early 1970s.[citation needed] The more important candidates for disengagement were areas where Soviet and US forces faced each other directly, such as Germany and Austria. The Austrian State Treaty is an example of formal, multilateral superpower disengagement, which left Austria neutral for the duration of the Cold War, with Austria staying out of the Warsaw Pact, NATO, and the European Economic Community. The 1952 Stalin Note is perhaps the most controversial proposal of superpower disengagement from Germany. Proposed early superpowers These are proposed examples of ancient or historical superpowers, taking into account that knowledge of what the "known world" comprised was extremely limited in past eras (for example, Europeans became aware of the existence of the Americas and Australia only after the Age of Discovery, which began in the late 15th century, and before this era they had very limited knowledge of East Asia as well). Many of the nations of this historical period were never superpowers; they were, however, regional powers with influence in their respective regions. Note: Does not take into account city-states and stateless nomadic peoples. In the early history of both regions, contact between these civilizations was very limited; long-distance trade certainly occurred, but primarily through long chains of intermediaries rather than directly. Regular contact between Egypt, Mesopotamia and Anatolia dates from this period; Mitanni was an important intermediary in the trade between these civilizations. Known by the Minoans and Mycenaean Greeks: contact with other civilizations was very limited; long-distance trade with Mesopotamia certainly occurred, but primarily through long chains of intermediaries rather than directly. 
The drachma, minted by many states, most notably Ptolemaic Egypt, was the reserve currency in the Mediterranean and Near East. Main reserve currency in the Mediterranean and Near East: the Roman denarius, later replaced by the Roman solidus. Not fully known outside East Asia; the West knew of these powers because of the Silk Road, although little information reached them. Isolated civilizations in relation to Afro-Eurasia. Isolated civilization in relation to Afro-Eurasia. Main reserve currency in the Mediterranean and Near East: the Roman solidus, later replaced by the dinar, minted by the Caliphates. During the Middle Ages the region was known to Arab merchants. Europeans were aware that the region existed (to the point that Mansa Musa was mentioned in the Catalan Atlas), but little information about the place reached Europe. Isolated civilization in relation to Afro-Eurasia. Isolated civilizations in relation to Afro-Eurasia. The Age of Discovery brought a broad change in globalization: it was the first period in which previously isolated parts of the world became connected to form the world system, and the first colonial empires of the early modern age emerged, such as the Portuguese, Spanish, Dutch and French empires. The British Empire, after the Glorious Revolution of 1688 and its pioneering role in the industrialization process of the 18th century, would achieve global hegemony in the 19th and early 20th centuries (before World War I). Contact between distant civilizations was greatly facilitated, as was the mapping of a large part of the planet, giving people of this historical period a far better understanding of the world map. According to historical statistics and research from the OECD, until the early modern period, Western Europe, China, and India accounted for roughly two thirds of the world's GDP. See also References Bibliography External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/NGC_2174] | [TOKENS: 337] |
Contents NGC 2174 NGC 2174 (also known as the Monkey Head Nebula) is an H II emission nebula located in the constellation Orion and is associated with the open star cluster NGC 2175. It was discovered on 6 February 1877 by the French astronomer Édouard Stephan. It is thought to lie about 6,400 light-years from Earth. The nebula may have formed through hierarchical collapse. There is some equivocation in the use of the identifiers NGC 2174 and NGC 2175: they may apply to the entire nebula, to its brightest knot, or to the star cluster it includes. Burnham's Celestial Handbook lists the entire nebula as 2174/2175 and does not mention the star cluster. The NGC Project (working from the original descriptive notes) assigns NGC 2174 to the prominent knot at J2000 06h 09m 23.7s, +20° 39′ 34″ and NGC 2175 to the entire nebula, and by extension to the star cluster. SIMBAD uses NGC 2174 for the nebula and NGC 2175 for the star cluster. Glowing gas and dark dust do not survive well in the Monkey Head Nebula: young stars near the center of the nebula generate stellar winds and high-energy radiation that cause the nebula's material to shift into complex shapes. The nebula is primarily composed of hydrogen, which glows at infrared wavelengths due to the radiation. Hubble 24th Anniversary Images (2014) Gallery See also References External links |
======================================== |
[SOURCE: https://github.com/enterprise] | [TOKENS: 4336] |
The AI-powered developer platform for the agent-ready enterprise Bring your DevOps together on one secure platform built for speed, scale, and the agent-driven future of software. Enterprise-grade by design A centrally governed foundation that provides the control and visibility you need to innovate securely at scale. Security built into every stage of the software lifecycle. GitHub integrates automated, developer-first security that keeps teams moving fast. Built for your most valuable asset: your developers GitHub transforms your engineering team into a high-performing, AI-powered force for innovation and growth. Bring every stage of the development lifecycle together on one secure platform. Go beyond code completion with AI that improves quality and problem-solving and fuels innovation. Skip the ramp-up and accelerate impact with the platform trusted by over 180 million developers. Tap into our ecosystem of apps, actions, and models to accelerate innovation. From reactive administration to strategic platform leadership. Take control with centralized governance and automation that scales with your enterprise. Adopted by the world's leading organizations Whether you’re a startup or Fortune 500, GitHub Enterprise gives you everything you need to innovate securely on the platform developers love. Get the most out of GitHub Enterprise GitHub Enterprise is an enterprise-grade software development platform designed for the complex workflows of modern development. 
As an extensible platform solution, GitHub Enterprise enables organizations to seamlessly integrate additional tools and functionalities, tailoring their development environment to meet specific needs and enhancing overall productivity. There are several reasons why organizations should consider using GitHub Enterprise: Accelerate development at scale with AI-powered development: GitHub is the world’s most widely adopted Copilot-powered developer platform helping organizations build, secure, and deliver innovative software at scale. Application security made simpler: Native security tools embedded into the developer workflow, such as GitHub Advanced Security, help developers easily fix security issues, while providing more visibility and controls. Centralize governance and compliance: Customers can access a range of administration features to help manage governance at scale and enforce business rules and policies to meet their specific needs. Boost productivity and collaboration: Increase productivity with automated CI/CD workflows using GitHub Actions, collaborate effectively with GitHub Projects and GitHub Issues, manage hosted packages with GitHub Packages, and utilize prebuilt and configured development environments with GitHub Codespaces. Greater flexibility and control over data: Whether self-hosting with GitHub Enterprise Server or using GitHub Enterprise Cloud, GitHub provides customers with flexibility and control over their data. And now with GitHub Enterprise Cloud with data residency, customers have enhanced control where certain data, like their code, resides. Start a free 30 day trial today or contact our sales team for more information. GitHub Enterprise is used by organizations of all sizes that require greater productivity, collaboration, and security capabilities for their software development process. GitHub Enterprise can scale with teams, all the way from a small startup to a large corporation. 
GitHub Enterprise Cloud is the cloud-based solution of GitHub Enterprise, hosted on GitHub’s servers. This eliminates the need for organizations to maintain their own servers, infrastructure, and updates, allowing them to focus on development. In addition to the core productivity and collaboration features it provides, GitHub Enterprise Cloud provides access to additional features and add-ons for security, support, managed users, and many more. Customers can easily add or remove users as needed, and they can also increase storage capacity or processing power as their needs change. And for customers desiring more control over their data, GitHub Enterprise Cloud with data residency provides improved enterprise-grade features and more control over where code is stored. Start a free 30 day trial today or contact our sales team for more information. GitHub Enterprise Server is the self-hosted version of GitHub Enterprise. It is installed on-premises or on a private cloud and provides organizations with a secure and customizable source code management and collaboration platform. One of the key advantages of GitHub Enterprise Server is that it provides organizations with complete control over their source code and data. Organizations can choose where to store their repositories and can control who has access to them. Administrators can also customize the platform to meet specific needs, such as integrating other tools or implementing custom workflows. GitHub Enterprise Server also offers enhanced security and compliance features. Organizations can configure their instance to meet their specific security requirements, such as using LDAP or SAML for authentication, setting up two-factor authentication, or implementing network security measures. Compliance features are also included, such as audit logs, access controls, and vulnerability scanning. 
GitHub Enterprise is designed with security in mind and includes a range of features to help organizations protect their code and data. Here are some of the key security features that GitHub Enterprise offers: Authentication and access controls: GitHub Enterprise includes two-factor authentication, LDAP and Active Directory integration, and OAuth authentication. This helps organizations ensure that only authorized users can access their repositories and data. Encryption: All data in transit between the user's computer and GitHub Enterprise server is encrypted using HTTPS. All data at rest uses AES-256 encryption. Vulnerability scanning: GitHub Enterprise includes built-in security scanning features that can detect known vulnerabilities and alert users. Audit logs: The platform provides detailed audit logs that record all user actions, including repository access, changes, and deletions. This helps organizations track and monitor user activity and identify potential security issues. Customizable policies: GitHub Enterprise allows organizations to create custom policies for repository access. This can help enforce compliance requirements and prevent unauthorized access to sensitive data. Regular security updates: There is also a dedicated security team that provides regular updates, monitors for potential security threats, and responds quickly to any issues that arise. No, GitHub Enterprise is not free. It is a paid product that can be paid for either as a metered service on a monthly basis or as a subscription, with the cost determined by the number of users and the level of support required. For organizations interested in trying out the platform before making a commitment, GitHub Enterprise offers a free trial. Furthermore, organizations can contact the GitHub Sales team for the option to request a custom quote to meet their specific needs. 
Developers can collaborate with GitHub Enterprise using a variety of tools that are built into the platform, including: Pull requests: Allows developers to propose changes to a repository and submit them for review. Other team members can review the changes, leave comments, and suggest further improvements. GitHub Projects: Enables developers to track issues, assign tasks, and prioritize work. This helps teams stay on track, identify and resolve issues quickly, and ensure that everyone is working towards the same goals. GitHub Discussions: Empowers developers to have conversations about specific topics. This can be particularly useful for triaging complex issues or making decisions about the direction of a project. To get started with GitHub Enterprise, try a free trial today or contact our sales team. GitHub Enterprise offers several plans that vary in price and features. They are designed to accommodate different types of organizations and teams, from small startups to large enterprises. These plans include: GitHub Enterprise Server: This is the self-hosted version of GitHub Enterprise. It is installed on-premises or on a private cloud, and offers all the features of the cloud-based version of GitHub, including pull requests, code reviews, and project management tools. Pricing depends on the number of users and support requirements. GitHub Enterprise Cloud: This is the cloud-based version of GitHub Enterprise. It is hosted on GitHub's servers, and it offers all the features of GitHub Enterprise Server. The price depends on the number of users and storage requirements. For more information on cost, please see our pricing page. A DevOps platform is a set of tools, technologies, and practices that enable software development and IT operations teams to collaborate and automate the software delivery process. It typically includes version control, continuous integration and continuous delivery (CI/CD), automated testing, deployment automation, and monitoring. 
The main goal of a DevOps platform is to provide a single environment for software development and IT operations teams. By automating the software delivery process, a DevOps platform helps organizations reduce the time and cost of delivering software, while also improving the reliability, security, and scalability of their applications. Developer experience (DevEx) refers to the overall experience that software developers have when using development tools, frameworks, and platforms to create software applications. It encompasses all aspects of a developer's interaction with the tools, including onboarding, maintaining, ease of use, and productivity. The goal of optimizing DevEx is to make it as easy as possible for developers to create high-quality software quickly. This can involve designing tools with intuitive interfaces, providing clear and concise documentation, seamlessly integrating tools into workflows, and offering comprehensive support to help developers overcome challenges and obstacles. By prioritizing DevEx, organizations can improve the speed and quality of their software development processes, increase developer satisfaction and retention, and ultimately deliver better products. A software development platform is a set of tools, technologies, and resources that enable software developers to create, test, deploy, and maintain software applications. This typically includes a programming language or framework, an integrated development environment (IDE), libraries, code repositories, debugging and testing tools, and deployment and hosting options. The goal of a software development platform is to provide developers with a comprehensive set of tools and resources that make it easier to develop high-quality software. By providing an integrated environment for software development, a software development platform can help developers streamline their workflows, reduce errors, and improve the speed and quality of their work. 
Additionally, many software development platforms also provide access to a community of developers who can offer support, advice, and resources for improving software development practices. An application development platform is a set of tools that enables developers to build, deploy, and manage custom software applications. This kind of platform typically includes a programming language, software development kits (SDKs), application programming interfaces (APIs), libraries, and testing and debugging tools. These tools are designed to make it easier for developers to create and deploy custom applications for a specific platform, such as a mobile device or web browser. The goal of an application development platform is to provide developers with a comprehensive set of tools that makes it easier to create high-quality applications that meet the specific requirements of a particular platform or device. Software development collaboration is the process of working together as a team to create, test, and deploy software applications. It can involve a range of activities, such as brainstorming, planning, code reviews, testing, and deployment. Collaboration is an essential component of the software development process, as it allows multiple developers and stakeholders to work together. Effective collaboration requires open communication, clear goals and objectives, shared resources, and a commitment to working together as a team. Collaboration tools such as version control systems, collaborative coding environments, and project management software can also provide a centralized location for team members to share information, coordinate tasks, and track progress. Ultimately, software development collaboration is essential to creating high-quality software that’s reliable, scalable, and meets the needs of end-users and stakeholders. Footnotes The Total Economic Impact™ Of GitHub Enterprise Cloud, a commissioned study conducted by Forrester Consulting, 2025. 
Results are for a composite organization based on interviewed customers. Forrester Wave™: DevOps Platforms, Q2 2025. Forrester does not endorse any company, product, brand, or service included in its research publications and does not advise any person to select the products or services of any company or brand based on the ratings included in such publications. Information is based on the best available resources. Opinions reflect judgment at the time and are subject to change. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Graph_(abstract_data_type)] | [TOKENS: 1380] |
Contents Graph (abstract data type) In computer science, a graph is an abstract data type that is meant to implement the undirected graph and directed graph concepts from the field of graph theory within mathematics. A graph data structure consists of a finite (and possibly mutable) set of vertices (also called nodes or points), together with a set of unordered pairs of these vertices for an undirected graph or a set of ordered pairs for a directed graph. These pairs are known as edges (also called links or lines), and for a directed graph are also known as arrows or arcs. The vertices may be part of the graph structure, or may be external entities represented by integer indices or references. A graph data structure may also associate to each edge some edge value, such as a symbolic label or a numeric attribute (cost, capacity, length, etc.). Operations The basic operations provided by a graph data structure G usually include: Structures that associate values to the edges usually also provide: Common data structures for graph representation The following table gives the time complexity cost of performing various operations on graphs, for each of these representations, with |V| the number of vertices and |E| the number of edges.[citation needed] In the matrix representations, the entries encode the cost of following an edge. The cost of an edge that is not present is assumed to be ∞. Adjacency lists are generally preferred for the representation of sparse graphs, while an adjacency matrix is preferred if the graph is dense; that is, the number of edges |E| is close to the number of vertices squared, |V|^2, or if one must be able to quickly look up whether there is an edge connecting two vertices. 
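The trade-off between the two representations can be sketched as follows; this is a minimal illustration (names and the example graph are not from the article), contrasting the adjacency list and adjacency matrix for the same directed graph.

```python
# A minimal sketch contrasting the two common graph representations
# for the same 4-vertex directed graph.

n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency list: one set of neighbours per vertex; space O(|V| + |E|),
# preferred for sparse graphs.
adj_list = [set() for _ in range(n)]
for u, v in edges:
    adj_list[u].add(v)

# Adjacency matrix: n x n grid; space O(|V|^2), but O(1) edge lookup,
# preferred for dense graphs.
adj_matrix = [[False] * n for _ in range(n)]
for u, v in edges:
    adj_matrix[u][v] = True

def adjacent(u, v):
    # Both representations answer the same membership query.
    return v in adj_list[u] and adj_matrix[u][v]
```

Iterating over the neighbours of a vertex costs O(deg(u)) with the list but O(|V|) with the matrix, which is why sparse-graph algorithms favour adjacency lists.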
The time complexity of operations in the adjacency list representation can be improved by storing the sets of adjacent vertices in more efficient data structures, such as hash tables or balanced binary search trees (the latter representation requires that vertices are identified by elements of a linearly ordered set, such as integers or character strings). A representation of adjacent vertices via hash tables leads to an amortized average time complexity of O(1) to test adjacency of two given vertices and to remove an edge, and an amortized average time complexity of O(deg(x)) to remove a given vertex x of degree deg(x). The time complexity of the other operations and the asymptotic space requirement do not change. Parallel representations The parallelization of graph problems faces significant challenges: data-driven computations, unstructured problems, poor locality, and a high ratio of data access to computation. The graph representation used for parallel architectures plays a significant role in facing those challenges. Poorly chosen representations may unnecessarily drive up the communication cost of the algorithm, which will decrease its scalability. In the following, shared and distributed memory architectures are considered. In the case of a shared memory model, the graph representations used for parallel processing are the same as in the sequential case, since parallel read-only access to the graph representation (e.g. an adjacency list) is efficient in shared memory. In the distributed memory model, the usual approach is to partition the vertex set V of the graph into p sets V_0, …, V_{p−1}. Here, p is the number of available processing elements (PE). The vertex set partitions are then distributed to the PEs with matching index, along with the corresponding edges. 
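The hash-table variant described above can be sketched like this for an undirected graph; the function names are illustrative, not from the article.

```python
# Sketch of adjacency sets backed by hash tables: amortized O(1)
# adjacency test and edge removal, O(deg(x)) vertex removal.
from collections import defaultdict

neighbors = defaultdict(set)

def add_edge(u, v):
    # Undirected: store the edge in both endpoints' sets.
    neighbors[u].add(v)
    neighbors[v].add(u)

def remove_edge(u, v):
    # Amortized O(1): two hash-set deletions.
    neighbors[u].discard(v)
    neighbors[v].discard(u)

def remove_vertex(x):
    # Amortized O(deg(x)): unlink x from each of its neighbours.
    for v in neighbors.pop(x, set()):
        neighbors[v].discard(x)

add_edge(1, 2)
add_edge(1, 3)
remove_vertex(1)
```

A balanced binary search tree in place of each hash set would give worst-case O(log deg(u)) adjacency tests instead of amortized O(1), at the cost of requiring ordered vertex identifiers.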
Every PE has its own subgraph representation, where edges with an endpoint in another partition require special attention. For standard communication interfaces like MPI, the ID of the PE owning the other endpoint has to be identifiable. During computation in a distributed graph algorithm, passing information along these edges implies communication. Partitioning the graph needs to be done carefully: there is a trade-off between low communication and evenly sized partitions. But partitioning a graph is an NP-hard problem, so it is not feasible to compute optimal partitions. Instead, the following heuristics are used. 1D partitioning: Every processor gets n/p vertices and the corresponding outgoing edges. This can be understood as a row-wise or column-wise decomposition of the adjacency matrix. For algorithms operating on this representation, this requires an all-to-all communication step as well as O(m) message buffer sizes, as each PE potentially has outgoing edges to every other PE. 2D partitioning: Every processor gets a submatrix of the adjacency matrix. Assume the processors are aligned in a rectangle p = p_r × p_c, where p_r and p_c are the numbers of processing elements in each row and column, respectively. Then each processor gets a submatrix of the adjacency matrix of dimension (n/p_r) × (n/p_c). This can be visualized as a checkerboard pattern in a matrix. Therefore, each processing unit can only have outgoing edges to PEs in the same row and column. This bounds the number of communication partners for each PE to p_r + p_c − 1 out of p = p_r × p_c possible ones. Compressed representations Graphs with trillions of edges occur in machine learning, social network analysis, and other areas. 
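The two heuristics can be made concrete by computing which PE owns a given vertex or matrix entry; the following is a small sketch under an assumed block layout (ceiling-division blocks, row-major PE numbering), not a prescribed scheme.

```python
# Owner computation for the two partitioning heuristics described above.

def owner_1d(u, n, p):
    # 1D: each PE gets a block of ~n/p consecutive vertices
    # (and their outgoing edges, i.e. whole rows of the matrix).
    block = (n + p - 1) // p          # ceil(n / p)
    return u // block

def owner_2d(u, v, n, p_r, p_c):
    # 2D: PEs form a p_r x p_c grid; entry (u, v) of the adjacency
    # matrix lands in the checkerboard tile of size (n/p_r) x (n/p_c).
    row_block = (n + p_r - 1) // p_r
    col_block = (n + p_c - 1) // p_c
    return (u // row_block) * p_c + (v // col_block)
```

Under this layout, all entries of row u lie in one PE-grid row and all entries of column v in one PE-grid column, which is exactly why each PE communicates with at most p_r + p_c − 1 others.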
Compressed graph representations have been developed to reduce I/O and memory requirements. General techniques such as Huffman coding are applicable, but the adjacency list or adjacency matrix can be processed in specific ways to increase efficiency. Applications of Graphs Breadth-first search (BFS) and depth-first search (DFS) are two closely related approaches that are used for exploring all of the nodes in a given connected component. Both start with an arbitrary node, the "root". Strongly connected components can also be found using graph traversals, with algorithms such as Kosaraju's algorithm, which is a modified DFS. Dijkstra's algorithm is a pathfinding algorithm that can be used on positively weighted graphs (all edge weights must be greater than or equal to zero), directed or undirected. It can be used to find the shortest path between two arbitrarily chosen nodes, which is commonly applied in routing problems. See also References External links |
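Dijkstra's algorithm as described above can be sketched in a few lines; this minimal version (the dict-of-dicts graph encoding is an assumption, not from the article) uses a binary heap and lazy deletion of stale queue entries.

```python
# Minimal Dijkstra sketch: single-source shortest paths on a graph
# with non-negative edge weights.
import heapq

def dijkstra(graph, source):
    # graph: dict mapping node -> {neighbor: non-negative weight}
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale entry, skip
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist

graph = {"a": {"b": 1, "c": 4}, "b": {"c": 2}, "c": {}}
```

With a binary heap this runs in O((|V| + |E|) log |V|); the non-negativity requirement is what guarantees that a node's distance is final the first time it is popped.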
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Euler_(programming_language)] | [TOKENS: 291] |
Contents Euler (programming language) Euler is a programming language created by Niklaus Wirth and Helmut Weber, conceived as an extension and generalization of ALGOL 60. The designers' goals were to create a language that is: Available sources indicate that Euler was operational by 1965. Overview Euler employs a general data type concept. In Euler, arrays, procedures, and switches are not quantities which are declared and named by identifiers: in contrast to ALGOL, they are not quantities on the same level as variables. Rather, these quantities are on the level of numeric and boolean constants. Thus, besides the traditional numeric and logical constants, Euler introduces several added types: All constants can be assigned to variables, which have the same form as in ALGOL, but for which no fixed types are specified: Euler uses dynamic typing. Further, a procedure can produce a value of any type when executed, and this type can vary from one call of the procedure to the next. Similarly, the elements of a list can have values of any type and these can differ from element to element within a list. So, when the list elements are labels, a switch is obtained. If the elements are procedures, a procedure list is obtained, which is unavailable in ALGOL 60. If the elements are lists themselves, then a general tree structure is obtained. Euler provides general type-test and type-conversion operators. See also References External links |
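Euler's untyped variables, procedure lists, and list-based trees can be loosely illustrated in a modern dynamically typed language; the following Python sketch is an analogy, not Euler syntax.

```python
# Loose illustration of the ideas described above: variables with no
# fixed type, and lists whose elements are procedures, giving the
# "procedure list" that ALGOL 60 lacked.

x = 3.14          # a numeric value ...
x = True          # ... the same variable may later hold a boolean

# A list of procedures: selecting an element and calling it acts like
# a computed dispatch (analogous to a switch over labels).
procedures = [lambda: "first", lambda: "second", lambda: "third"]

def dispatch(i):
    return procedures[i]()

# If list elements are themselves lists, a general tree structure
# is obtained.
tree = [1, [2, [3, 4]], 5]
```

As in Euler, nothing ties a variable or list slot to one type, so the same machinery covers switches (lists of labels), procedure lists, and trees.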
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Intellectual_Ventures] | [TOKENS: 1521] |
Contents Intellectual Ventures Intellectual Ventures is an American private equity company that centers on the development and licensing of intellectual property. Intellectual Ventures is one of the top-five owners of U.S. patents, as of 2011.[verification needed] Its business model focuses on buying patents, aggregating those patents into a large patent portfolio, and licensing these patents to third parties. The company has been described as the country's largest and most notorious patent trolling company, the ultimate patent troll, and the most hated company in tech. In 2009, the firm launched a prototyping and research laboratory, Intellectual Ventures Lab, which attracted media controversy when the book SuperFreakonomics described its ideas for reducing global climate change. The firm also collaborates on humanitarian projects through its Global Good program. Overview In 2000, Intellectual Ventures was founded as a private partnership by Nathan Myhrvold and Edward Jung of Microsoft, later joined by co-founders Peter Detkin of Intel and Gregory Gorder of Perkins Coie. The Intellectual Ventures Management Company is owned 40% Nathan Myhrvold, 20% Peter Detkin, 20% Gregory Gorder and 20% Edward Jung. They reportedly have raised over $5.5 billion from many large companies including Microsoft, Intel, Sony, Nokia, Apple, Google, Yahoo, American Express, Adobe, SAP, Nvidia, and eBay, plus institutional investors such as Stanford, the Hewlett Foundation, the Mayo Clinic, and Charles River Ventures. In December 2013, the firm released a list of approximately 33,000 of the nearly 40,000 assets in their monetization program. Licenses to patents are obtained through investment and royalties. 
In March 2009, the firm announced expansion into China, India, Japan, Korea and Singapore to build partnerships with scientists and institutions in Asia.[citation needed] Investment funds The company operates three primary investment funds: Intellectual Ventures Lab In 2009, Intellectual Ventures launched a prototyping and research laboratory, Intellectual Ventures Lab, hiring scientists to imagine inventions that could exist but do not yet, and then filing descriptions of these inventions with the US Patent Office. Notable participants include Robert Langer of MIT, Leroy Hood of the Institute for Systems Biology, Ed Harlow of Harvard Medical School, Bran Ferren and Danny Hillis of Applied Minds, and Sir John Pendry of Imperial College.[citation needed] The Sunday Times reported that the company applies for about 450 patents per year, in areas from vaccine research to optical computing and, as of May 2010, 91 of the applications had been approved. Internally developed inventions include a safer nuclear reactor design (which won the MIT Technology Review Top 10 Emerging Technologies in 2009) that can use uranium waste as fuel or thorium, which is plentiful and poses no proliferation risk, a mosquito-targeting laser, and a series of computer models of infectious disease. Their efforts to promote a method to reverse or reduce the effects of global climate change by artificially recreating the conditions from the aftermath of a volcanic eruption gained media coverage following the release of the book SuperFreakonomics, whose chapter about global warming proposes that the global climate can be regulated by geo-engineering of a stratoshield based upon patented technology from the company. The chapter has been criticized by some economists and climate science experts who say it contains numerous misleading statements and discredited arguments, including its presentation of geoengineering as a replacement for CO2 emissions reduction. 
Among the critics are Paul Krugman, Brad DeLong, The Guardian, and The Economist. Elizabeth Kolbert, a science writer for The New Yorker who has written extensively on global warming, contends that "just about everything they [Levitt and Dubner] have to say on the topic is, factually speaking, wrong." In response, Levitt and Dubner have stated on their Freakonomics blog that global warming is man-made and an important issue. They warn against claims of an inevitable doomsday; instead they look to raise awareness of less traditional or popular methods to tackle the potential problem of global warming. Lowell Wood, an "inventor in residence" at Intellectual Ventures, became the most-patented inventor in US history in 2015, breaking the record held by Thomas Edison for over 80 years. Global Good Global Good was a not-for-profit collaboration between the firm and the Gates family, to develop solutions for pressing problems in the developing world. Its technologies included: In mid-2020, Global Good was dismantled, with some of its components (most notably the Institute for Disease Modeling) transitioning into the Gates Foundation, and some evolving into new entities under Gates Ventures. Companies created Intellectual Ventures has created a number of independent companies to bring its discoveries to mass market. Examples include Kymeta, a satellite technology company, TerraPower, which seeks to improve nuclear power, Evolv, which applies metamaterials to imaging, and Echodyne, a metamaterials-based radar communications company. Controversy Publicly, Intellectual Ventures states that a major goal is to assist small inventors against corporations. 
In practice, the vast majority of IV's revenue comes from buying patents, aggregating these patents into a single portfolio spanning many disparate technologies and tying these patents together for license to other companies under the threat of litigation, or filing lawsuits for infringement of patents, a controversial practice referred to as "patent trolling." Intellectual Ventures' purchased patents have largely been kept secret, though press releases with Telcordia and Transmeta indicated some or all of their patent portfolios were sold to the company. It reports that its purchasing activity as of spring 2010 has sent $350 million to individual inventors, and $848 million to small and medium size enterprises as well as returning "approximately $1 billion" to investors before filing any lawsuits, but IV's assistance to individual inventors has been contested. Investigative journalism suggests that the company makes most of its income from lawsuits and licensing of already-existing inventions, rather than from its own innovation. Intellectual Ventures has been described as a "patent troll" by Shane Robison, CTO of Hewlett-Packard and others, allegedly accumulating patents not in order to develop products around them but with the goal to pressure large companies into paying licensing fees. Recent reports indicate that Verizon and Cisco made payments of $200 million to $400 million for investment and licenses to the Intellectual Ventures portfolio. On December 8, 2010, in its 10th year of operations, Intellectual Ventures filed its first lawsuit, accusing Check Point, McAfee, Symantec, Trend Micro, Elpida, Hynix, Altera, Lattice and Microsemi of patent infringement. In September 2016, the Court of Appeals for the Federal Circuit ruled that all the relevant patent claims in the lawsuit were invalid, because "the patent merely applies a well-known idea using generic computers". 
The company has been accused of hiding behind shell companies for earlier lawsuits, an accusation consistent with the findings of NPR's Planet Money in July 2011. The episode, which also aired as the This American Life episode "When Patents Attack", was dedicated to software patents, prominently featuring Intellectual Ventures. It includes sources accusing Intellectual Ventures of pursuing a strategy encouraging mutually assured destruction, including Chris Sacca, who likened Myhrvold's argument that Intellectual Ventures is offering protection from lawsuits to a "mafia-style shakedown". Intellectual Ventures staff are active in lobbying and testifying in court on United States patent policy. References External links |
======================================== |