[SOURCE: https://en.wikipedia.org/wiki/OpenAI]
OpenAI

OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI.

The organization was founded in 2015 in Delaware but has since evolved a complex corporate structure. As of October 2025, following a restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees and other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft previously invested over $13 billion in OpenAI and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion.

In 2023 and 2024, OpenAI faced multiple lawsuits from authors and media companies alleging copyright infringement of work used to train some of its products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. Throughout 2024, roughly half of the AI safety researchers then employed at OpenAI left the company, citing its prominent role in an industry-wide problem.

Founding

In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Altman and Musk as co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected significantly lagged the pledges: according to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but it later restricted access to its most capable models, citing competitive and safety concerns.

OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence.
OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected the project to take decades and eventually surpass human intelligence.

Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of leading AI researchers; he was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those at Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead.

In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, reducing some training runs from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications.

Corporate structure

In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with profit capped at 100 times any investment. According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, which announced a $1 billion investment in the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft.

OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars might turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence.

The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC.
In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI.

On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization; OpenAI dismissed the case as "incoherent" and "frivolous", though Musk revived legal action against Altman and others in August 2024. On April 9, 2025, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that the company was not for sale, but the bid complicated Altman's restructuring plan by suggesting a floor for how much the nonprofit should be valued.

OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit subsidiary into a Delaware-based public benefit corporation (PBC) and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, receiving equity in return, and would use that equity to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investment, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan was criticized by former employees. A letter titled "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, arguing that the restructuring was illegal and would strip governance safeguards from the nonprofit and remove oversight from the attorneys general. The letter argued that OpenAI's complex structure was deliberately designed to keep the organization accountable to its mission, without the conflicting pressure of maximizing profits, and contended that the nonprofit was best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, regardless of how much equity it might receive in exchange.

PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation.
The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time, and members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman said was the most likely path forward.

In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, part of which would go toward use of Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot as Microsoft Copilot, added Copilot to many Windows installations, and released Microsoft Copilot mobile apps. Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, a milestone that must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies.

In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks.

In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, with investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank, and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion; the partners planned to fund it over the following four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for military AI applications, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently launched a $50 million fund to support nonprofit and community organizations.

In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal on record at the time. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter, and Thrive. In July 2025, the company reported annualized revenue of $12 billion.
This was up from $3.7 billion in 2024, driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models, and it projects an $8 billion operating loss for 2025. OpenAI has reported revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures escalating to $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability: OpenAI targets cash-flow-positive operations by 2029 and projects revenue of approximately $200 billion by 2030. This spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI technology and the company's stated intention to remain a leader in the industry.

In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion and made OpenAI the world's most valuable privately held company, surpassing SpaceX.

On November 17, 2023, Sam Altman was removed as CEO after OpenAI's board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo, and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor.

On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if the talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board had initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced that Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
Some 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign, which prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles, along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees had raised concerns to the board about how he handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting observer of the company's operations; Microsoft gave up the observer seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine whether Altman's alleged lack of candor had misled investors. In 2024, following Altman's temporary removal and return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers.

In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave under which OpenAI would receive $350 million worth of CoreWeave shares and access to AI infrastructure in return for $11.9 billion paid over five years; Microsoft was already CoreWeave's biggest customer in 2024. Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications; the company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired the personal finance app Roi in October 2025. The same month, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications; the Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired the healthcare technology startup Torch for approximately $60 million; the acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities.
OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. The text to be annotated, however, usually contained detailed descriptions of various types of violence, including sexual violence. An investigation by Time found that OpenAI began sending snippets of data to Sama as early as November 2021, and the four Sama employees interviewed by the magazine described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which Sama passed on the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 also covered other implicit costs, among them infrastructure expenses, quality assurance, and management.

In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market.

In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the next five years. Also in September 2025, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, the deal had not been finalized, and the two sides were rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion-dollar deal with AMD under which OpenAI committed to purchasing six gigawatts' worth of AMD chips, starting with the MI450, and received the option to buy up to 160 million shares of AMD, about 10% of the company, contingent on development, performance, and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform; more than 200 Disney, Marvel, Star Wars, and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership under which OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects.

OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft.
In December 2024, OpenAI said it would partner with the defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army, joining Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT.

Services

In February 2019, OpenAI announced GPT-2, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, simply named "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture.

OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5, on November 30, 2022. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365, and other products.

On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand; access for new subscribers re-opened a month later, on December 13.

In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed Strawberry. Additionally, the company introduced ChatGPT Pro, a $200-per-month subscription service offering unlimited o1 access and enhanced voice features, and shared preliminary benchmark results for the upcoming OpenAI o3 models.

On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users; the feature was only available to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning.
In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind had solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold medal-level performance, reflecting significant progress in AI's reasoning abilities.

On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which the company said would be better at creating spreadsheets, building presentations, perceiving images, writing code, and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, with features for managing citations, formatting complex equations, and real-time collaborative editing.

In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this departure. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years.

In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity.

In July 2023, OpenAI launched the superalignment project, aiming to determine within four years how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although team members said they never received anything close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company.

In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google through an experimental "share with search engines" feature. The opt-in toggle, intended to let users make specific chats discoverable, resulted in discussions containing personal details such as names, locations, and intimate topics appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and began coordinating with search providers to remove the exposed content, emphasizing that the incident was not a security breach but a design flaw that heightened privacy risks. CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.
Management

In 2018, Musk resigned from his board seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners and his co-founding of the AI startup Inflection AI; Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki; Jan Leike, co-leader of the superalignment team, also departed, citing concerns over safety and trust. OpenAI subsequently signed content deals with Reddit, News Corp, Axios, and Vox Media, and Paul Nakasone joined the board. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors.

Governance and legal issues

In May 2023, Sam Altman, Greg Brockman, and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could arrive within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems below that threshold should not be overly regulated. They also called for more technical safety research on superintelligences and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of".

In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary investigative matters and nonpublic, but the FTC's document was leaked. The investigation concerned allegations that the company had scraped public data and published false and defamatory information; the FTC asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public.

In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which had disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1.
Following DeepSeek's market emergence, OpenAI enhanced its security protocols to protect proprietary development techniques from industrial espionage. Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed.

According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated preempting state AI laws with federal law. According to Scott Kohler, OpenAI opposed California's AI legislation and suggested that the state bill encroached on matters better handled at the federal level. Public Citizen opposed federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation.

Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he had forfeited his vested equity in OpenAI in order to leave without signing it. Sam Altman stated that he had been unaware of the equity cancellation provision and that OpenAI had never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement.

OpenAI was sued for copyright infringement by the authors Sarah Silverman, Matthew Butterick, Paul Tremblay, and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult, and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024, it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books.

In 2021, OpenAI developed a speech recognition tool called Whisper and used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4.

In February 2024, The Intercept, Raw Story, and Alternate Media Inc. filed lawsuits against OpenAI alleging copyright infringement; the litigation is said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications were The Mercury News, The Denver Post, The Orange County Register, the St. Paul Pioneer Press, the Chicago Tribune, the Orlando Sentinel, the Sun Sentinel, and the New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker.
That suit was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models.

On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform, even as other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press, and CBC, sued OpenAI for using their news articles to train its software without permission.

In an October 2024 New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing the commercial LLMs he had helped engineer. He was a likely witness in a major copyright trial against the AI company and was one of several current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced; California Congressman Ro Khanna endorsed calls for an investigation.

On April 24, 2025, Ziff Davis, known for publications such as ZDNet, PCMag, CNET, IGN, and Lifehacker, sued OpenAI in Delaware federal court for copyright infringement.

In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation: a text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, a request to correct the mistake was denied, and OpenAI claimed that neither the recipients of ChatGPT's output nor the sources used could be disclosed.

OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit using the service "to harm yourself or others" and to "develop or use weapons".

In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI and CEO Sam Altman, alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco.
In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT use. In December 2025, the estate of Suzanne Adams sued OpenAI over her death: her son, Stein-Erik Soelberg, then 56 and experiencing paranoid delusions, had allegedly murdered her after months of discussing his ideas with ChatGPT. The estate claimed that the company shared responsibility because of the risk of "chatbot psychosis", although that is not a recognized medical diagnosis. OpenAI responded that it would work to make ChatGPT safer for users disconnected from reality.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Origins_of_the_War_of_1812]
Origins of the War of 1812

The origins of the War of 1812 (1812–1815), fought between the United States and the United Kingdom's British Empire and its First Nations allies, have long been debated. Multiple factors led to the US declaration of war on Great Britain that began the War of 1812. American expansion into the Northwest Territory (now Ohio, Indiana, Michigan, Illinois, Wisconsin, and northeast Minnesota) was impeded by Native American raids. Some historians maintain that an American goal in the war was to annex some or all of Canada, a view many Canadians still share. However, many argue that inducing the fear of such a seizure was merely an American tactic, designed to obtain a bargaining chip. Some members of the British Parliament and dissident American politicians such as John Randolph of Roanoke claimed that American expansionism, rather than maritime disputes, was the primary motivation for the American declaration of war, a view that some historians have retained. Although the British made some concessions on neutral trade before the war, they insisted on the right to reclaim their deserting sailors. The British also had a long-standing goal of creating a large "neutral" Native American state that would cover much of Ohio, Indiana, and Michigan. They made the demand as late as 1814 at the Ghent peace conference, but they lost the battles that would have upheld those claims.

British goals

The British Empire was engaged in a life-and-death war against Napoleon Bonaparte's France and believed it could not allow the Americans to help the enemy through trade, regardless of their lawful rights as neutrals to do so. As the historian Reginald Horsman explained, "If possible, England wished to avoid war with America, but not to the extent of allowing her to hinder the British war effort against France. Moreover... a large section of influential British opinion, both in the government and in the country, thought that America presented a threat to British maritime supremacy." According to the historian Andrew Lambert, the defense of British North America was Britain's primary goal: "The British had no interest in fighting this war, and once it had begun, they had one clear goal: keep the United States from taking any part of Canada". Britain's policy was to bring the war to an end through continuous campaigning that would influence the American public and government policy. All parties were committed to the defeat of France, which required sailors, and thus impressment, as well as an all-out commercial war against France, which caused the restrictions imposed on American merchant ships.

On the question of trade with America, the British parties split. As Horsman argues, "Some restrictions on neutral commerce were essential for England in this period. That this restriction took such an extreme form after 1807 stemmed, not only from the effort to defeat Napoleon, but also from the undoubted jealousy of America's commercial prosperity that existed in England. America was unfortunate in that, for most of the period from 1803 to 1812, political power in England was held by a group that was pledged not only to the defeat of France, but also to a rigid maintenance of Britain's commercial supremacy." In mid-1812 that group was weakened by Whigs friendly to the US, and the policies were reversed, although the US had already declared war. By 1815, Britain was no longer controlled by politicians dedicated to commercial supremacy, and so that cause had vanished.
The British were hindered by weak diplomats in Washington, such as David Erskine, who were unable to represent a consistent British policy, and by communications so slow that the Americans did not learn of the reversal of policy until after they had declared war. The Americans proposed a truce conditioned on the British ending impressment, but the British refused because they believed they needed those sailors. Horsman explained, "Impressment, which was the main point of contention between England and America from 1803 to 1807, was made necessary, primarily, because of England's great shortage of seamen for the war against Napoleon. In a similar manner, the restrictions on American commerce imposed by England's Orders in Council, which were the supreme cause of complaint between 1807 and 1812, were one part of a vast commercial struggle being waged between England and France."

The British also had the long-standing goal of creating an Indian barrier state, a large "neutral" Indian state covering most of the Old Northwest, to serve as a barrier between the western US and Canada. It would be independent of the US and under the tutelage of the British, who would use it to block American expansionism and to build up their control of the fur trade. The British continued to make that demand as late as 1814, during the Ghent peace conference, but dropped it after their position was weakened by the collapse of Tecumseh's confederacy following the Battle of the Thames, and because they no longer considered the goal worth a war against the US, although much of the proposed buffer state had remained largely under British and Indian control throughout the war. Britain did, however, insist on including in the treaty (clause IX) the right of Indians to return to the lands they had lost after 1811, even though it doubted that America would uphold this.

American goals

There were several immediate stated causes for the American declaration of war. Indians based in the Northwest Territory, now the states of Ohio, Indiana, Illinois, Michigan, and Wisconsin, had organized in opposition to American settlement and were being supplied with weapons by British traders in Canada. Britain was not trying to provoke a war and at one point cut its allocations of gunpowder to the tribes, but it was trying to build up its fur trade and friendly relations with potential military allies. Britain had ceded the area to the United States in the Treaty of Paris (1783) but had the long-term goal of creating a "neutral" or buffer Indian state there to block further American growth. The Indian nations generally followed Tenskwatawa, the Shawnee Prophet and brother of Tecumseh, who since 1805 had preached his vision of purifying his society by expelling the "Children of the Evil Spirit" (the American settlers). According to Pratt: "There is ample proof that the British authorities did all in their power to hold or win the allegiance of the Indians of the Northwest with the expectation of using them as allies in the event of war. Indian allegiance could be held only by gifts, and, to an Indian, no gift was as acceptable as a lethal weapon. Guns and ammunition, tomahawks and scalping knives were dealt out with some liberality by British agents." Raiding grew more common in 1810 and 1811, and Westerners in Congress found the raids intolerable and wanted them permanently ended.

Historians have considered the idea that American expansionism was one cause of the war.
The American expansion into the Northwest Territory (now Ohio, Indiana, Illinois, Michigan, and Wisconsin) was being blocked by Indians, which was a major grievance animating the Westerners. The American historian Walter Nugent, in his history of American expansionism, argues that expansion into the Midwest "was not the only American objective, and indeed not the immediate one, but it was an objective."

More controversial is whether an American war goal was to acquire Canadian lands, especially what is now western Ontario, permanently, or whether the plan was to seize the area temporarily as a bargaining chip. Belief in the American desire for Canada has been a staple of Canadian public opinion since the 1830s and was much discussed among historians before 1940, but has since become less popular. The idea was first developed by the historian Louis M. Hacker and refined by the diplomatic specialist Julius Pratt. In 1925, Pratt argued that Western Americans were incited to war by the prospect of seizing Canada. Pratt's argument supported the belief of many Canadians, especially in Ontario, where fear of American expansionism was a major political element, and the notion still survives among Canadians.

In 2010, the American historian Alan Taylor examined the political dimension of the annexation issue as Congress debated whether to declare war in 1811 and 1812. The Federalist Party was strongly opposed to war and to annexation, as were the Northeastern states. The majority in Congress was held by the Democratic-Republican Party, which was split on the issue. One faction wanted the permanent expulsion of Britain and the annexation of Canada. John Randolph of Roanoke, representing Virginia, commented, "Agrarian greed, not maritime right, urges this war. We have heard but one word, like the whippoorwill's one monotonous tone: Canada! Canada! Canada!" The other faction, based in the South, said that acquiring new territory in the North would give the North too much power, and so opposed the incorporation of Canada, whose Catholic population was viewed as "unfit by faith, language and illiteracy for republican citizenship." The Senate held a series of debates and twice voted on proposals that explicitly endorsed annexation, neither of which passed; the second failed only because of a proviso stating that Canada could be returned to British rule after it had been annexed. War was declared with no mention of annexation, but widespread support for it existed among the War Hawks. Some Southerners supported expansionism: Tennessee Senator Felix Grundy considered acquiring Canada essential to preserving the domestic political balance, arguing that annexation would maintain the free state-slave state equilibrium, which might otherwise be upset by the acquisition of Florida and the settlement of the southern areas of the new Louisiana Purchase. Even James Monroe and Henry Clay, key officials in the government, expected to gain at least Upper Canada from a successful war.

American commanders like General William Hull and Alexander Smyth issued proclamations to Canadians stating that the war was aimed at liberating them from British oppression and announcing an intention to annex the Canadas into the United States. Hull's proclamation also threatened them with "the horrors & calamities of war" and promised to kill out of hand any white man found fighting by the side of an Indian. Smyth wrote to his troops that when they entered Canada, "You enter a country that is to become one with the United States.
You will arrive among a people who are to become your fellow-citizens." These proclamations echoed similar appeals made during the American Revolution, such as the Continental Army's Letters to the Inhabitants of Canada.

Historians now generally agree that an invasion and seizure of Canada was the main American military strategy once the war had begun: with British control of the oceans, there was no other way to actively fight against British interests. President James Madison believed that food supplies from Canada were essential to the British overseas empire in the West Indies and that an American seizure would be an excellent bargaining chip at the peace conference. During the war, some Americans speculated that they might as well keep all of Canada. Thomas Jefferson, for example, by then out of power, argued that the expulsion of British interests from nearby Canada would remove a long-term threat to American republicanism. The New Zealand historian J.C.A. Stagg argued that Madison and his advisers believed the conquest of Canada would be easy and that economic coercion would force the British to come to terms by cutting off the food supply for their highly valuable West Indies sugar colonies; possession of Canada would also be a valuable bargaining chip. Stagg suggested that frontiersmen demanded the seizure of Canada not because they wanted the land, since they had plenty of it, but because the British were thought to be arming the Indians and thus blocking settlement of the West.

The historian Donald Hickey flatly stated, "The desire to annex Canada did not bring on the war." Brown (1964) concluded, "The purpose of the Canadian expedition was to serve negotiation not to annex Canada." Alfred Leroy Burt, a Canadian scholar who was also a professor at an American university, agreed completely, noting that Foster, the British minister to Washington, likewise rejected the argument that annexation of Canada was a war goal. However, Foster had also rejected the possibility of a declaration of war at all, even while dining with several of the more prominent War Hawks, so his judgement on such matters can be questioned. Stagg, for his part, stated that "had the War of 1812 been a successful military venture, the Madison administration would have been reluctant to have returned occupied Canadian territory to the enemy."

Other authors concur. One stated, "Expansion was not the only American objective, and indeed not the immediate one. But it was an objective." "The American yearning to absorb Canada was long-standing.... In 1812 it became part of a grand strategy." Another suggested, "Americans harbored 'manifest destiny' ideas of Canadian annexation throughout the Nineteenth Century." A third stated, "The [American] belief that the United States would one day annex Canada had a continuous existence from the early days of the War of Independence to the War of 1812 [and] was a factor of primary importance in bringing on the war." Yet another stated that "acquiring Canada would satisfy America's expansionist desires". The historian Spencer Tucker wrote, "War Hawks were eager to wage war with the British, not only to end Indian depredations in the Midwest but also to seize Canada and perhaps Spanish Florida."

Most of the inhabitants of Upper Canada (now Ontario) were of American origin: some were exiled United Empire Loyalists, but most were recent immigrants.
The Loyalists were extremely hostile to American annexation, while the other settlers seem to have been uninterested and to have remained neutral during the war. The Canadian colonies were thinly populated and only lightly defended by the British Army, and some Americans believed that many in Upper Canada would rise up and greet an American invading army as liberators. The combination implied an easy conquest. Once the war began, ex-President Thomas Jefferson warned that the British presence posed a grave threat, pointing to "The infamous intrigues of Great Britain to destroy our government... and with the Indians to Tomahawk our women and children, prove that the cession of Canada, their fulcrum for these Machiavellian levers, must be a sine qua non at a treaty of peace." He predicted in late 1812 that "the acquisition of Canada this year, as far as the neighborhood of Quebec, will be a mere matter of marching, and will give us the experience for the attack on Halifax, the next and final expulsion of England from the American continent."

Violations of U.S. rights

The long wars between Britain and France (1793–1815) led to repeated American complaints that both powers violated the US's rights, as a neutral power, to trade with both sides. Furthermore, Americans complained loudly that British agents in Canada were supplying munitions to hostile Native American tribes living in US territories. In the mid-1790s, the Royal Navy, short of manpower, began boarding American merchant ships to seize American and British sailors from American vessels. Although the policy of impressment was supposed to reclaim only British subjects, the law of Britain and most other countries defined nationality by birth, whereas American law allowed individuals who had been resident in the country for some time to adopt US citizenship. Many individuals were therefore British by British law but American by American law. The confusion was compounded by the refusal of Jefferson and Madison to issue any official citizenship documents: their position was that all persons serving on American ships were to be regarded as US citizens, so no further evidence was required. That stance was motivated by the advice of Albert Gallatin, who had calculated that half of the US deep-sea merchant seamen (9,000 men) were British subjects, and that allowing the Royal Navy to reclaim those men would destroy both the US economy and the government's vital customs revenue. Any sort of accommodation would jeopardize those men, and so accords such as the proposed Monroe–Pinkney Treaty (1806) between the US and Britain were rejected by Jefferson.

To fill the need for some sort of identification, US consuls provided unofficial papers, but these relied on unverifiable declarations by the individuals concerned as evidence of citizenship, and the large fees paid for the documents made them a lucrative sideline. In turn, British officers, who were short of personnel and convinced, somewhat reasonably, that the American flag was covering a large number of British deserters, tended to treat such papers with scorn. Between 1806 and 1812, about 6,000 seamen were impressed and taken against their will into the Royal Navy; 3,800 of them were later released.

Honor

A number of American contemporaries called the conflict a "Second War for Independence." Henry Clay and John C. Calhoun pushed a declaration of war through Congress by stressing the need to uphold American honor and independence.
The historian Norman Risjord emphasized the central importance of honor as a cause of the war. Americans of every political stripe saw the need to uphold national honor and to reject the treatment of the United States by Britain as a third-class nonentity. Americans talked incessantly about the need for force in response. That quest for honor was a major cause of the war in the sense that most Americans who were not involved in mercantile interests or threatened by Indian attack strongly endorsed the preservation of national honor. The humiliating attack by HMS Leopard against USS Chesapeake in June 1807 was a decisive event. Many Americans called for war, but Jefferson held back, insisting that economic warfare, which he initiated chiefly in the form of an embargo on trade with Britain, would prove more successful. The policy failed to deter the British, but it seriously damaged American industry and alienated the hard-hit mercantile cities of the Northeast. Historians have demonstrated how powerfully the motive of honor shaped public opinion in a number of states, including Massachusetts, Ohio, Pennsylvania, Tennessee, and Virginia, as well as the territory of Michigan. On 3 June 1812, the House Committee on Foreign Affairs, chaired by the pro-war extremist John C. Calhoun, called for a declaration of war in ringing phrases, denouncing Britain's "lust for power," "unbounded tyranny," and "mad ambition." James Roark wrote, "These were fighting words in a war that was in large measure about insult and honor." Calhoun reaped much of the credit. The conclusion of the war, especially the spectacular defeat of the main British invasion army at New Orleans, restored the American sense of honor. US economic motivations The failure of Jefferson's embargo and of Madison's economic coercion, according to the historian Reginald Horsman, "made war or absolute submission to England the only alternatives, and the latter presented more terrors to the recent colonists. The war hawks came from the West and the South, regions that had supported economic warfare and were suffering the most from British restrictions at sea. The merchants of New England earned large profits from the wartime carrying trade, in spite of the numerous captures by both France and England, but the western and southern farmers, who looked longingly at the export market, were suffering a depression that made them demand war." Prewar incidents The impressment dispute came to the forefront with the Chesapeake–Leopard affair of 1807, when the British warship HMS Leopard fired on and boarded the American warship USS Chesapeake, killing three sailors and carrying off four deserters from the Royal Navy. (Only one was a British subject, and he was later hanged; the other three were American citizens and were eventually returned, though the last two not until 1812.) The American public was outraged by the incident, and many called for war to assert American sovereignty and national honor. The Chesapeake–Leopard affair followed closely on the similar Leander affair, which had resulted in Jefferson banning certain British warships and their captains from American ports and waters. Whether in response to that incident or the Chesapeake–Leopard affair, Jefferson banned all foreign armed vessels from American waters except for those bearing dispatches.
In December 1808, an American officer expelled HMS Sandwich from Savannah, Georgia; the schooner had entered with dispatches for the British consul there. Meanwhile, Napoleon's Continental System and the British Orders in Council established embargoes that made international trade precarious. From 1807 to 1812, about 900 American ships were seized as a result. The US responded with the Embargo Act of 1807, which prohibited American ships from sailing to any foreign ports and closed American ports to British ships. Jefferson's embargo was especially unpopular in New England, whose merchants preferred the indignities of impressment to the halting of overseas commerce. The discontent contributed to the calling of the Hartford Convention in 1814. The Embargo Act had no effect on either Britain or France and so was replaced by the Non-Intercourse Act of 1809, which lifted all embargoes on American shipping except for ships bound for British or French ports. As that proved to be unenforceable, it was replaced in 1810 by Macon's Bill Number 2, which lifted all embargoes but offered that if France or Britain ceased its interference with American shipping, the US would reinstate an embargo on the other nation. Napoleon, seeing an opportunity to make trouble for Britain, promised to leave American ships alone, though he had no intention of honoring his promise; the US nonetheless reinstated the embargo against Britain and moved closer to declaring war. Exacerbating the situation, the Sauk Indians, who controlled trade on the Upper Mississippi, were displeased with the US government after the 1804 treaty between Quashquame and William Henry Harrison ceded Sauk territory in Illinois and Missouri to the US. The Sauk held the treaty to be unjust, maintaining that Quashquame had not been authorized to sign away land and had not understood what he was signing. The establishment of Fort Madison in 1808 on the Mississippi had further angered the Sauk and led many, including Black Hawk, to side with the British before the war broke out. The Sauk and allied Indians, including the Ho-Chunk (Winnebago), were very effective fighters for the British on the Mississippi and helped to defeat the Americans at Fort Madison and at Fort McKay in Prairie du Chien. Declaration of war In the US House of Representatives, a group of young Democratic-Republicans known as the "War Hawks" came to the forefront in 1811, led by Speaker Henry Clay of Kentucky and John C. Calhoun of South Carolina. They advocated going to war against Britain for all of the reasons listed above but concentrated on their grievances more than on territorial expansion. On 1 June 1812, President James Madison gave a speech to the US Congress that recounted American grievances against Britain but did not specifically call for a declaration of war. After Madison's speech, the House of Representatives quickly voted (79 to 49) to declare war, and the Senate did the same by 19 to 13. The conflict formally began on 18 June 1812, when Madison signed the measure into law. It was the first time that the US had declared war on another nation, and the congressional vote was the closest on a declaration of war in American history. None of the 39 Federalists in Congress voted for the war; critics later referred to it as "Mr. Madison's War".
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-98] | [TOKENS: 17273]
United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890.
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" rarely refers to topics unrelated to the United States, even though "the Americas" describes the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million.
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and states in the North enacted laws to prohibit slavery within their boundaries; an active abolitionist movement reemerged in the 1830s. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and Iraq. The collapse of the U.S. housing bubble in 2007 triggered the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean.
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered most attractive to new residents are often those most vulnerable to extreme weather. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared among three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression.
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force.
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating at levels from local to national. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, national security, enforcing U.S. federal courts' rulings and federal laws, and interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasury securities market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA partners Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right.
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth in such spending as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States' private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity.
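As a rough consistency check, the fossil-fuel share quoted above follows from summing the rounded component shares given in this section (a worked sum of the figures already stated, not an additional source figure):

\[
\underbrace{38\%}_{\text{petroleum}} + \underbrace{36\%}_{\text{natural gas}} + \underbrace{9\%}_{\text{coal}} = 83\% \approx 84\%
\]

The rounded shares of all five sources total 101%, so the one-point gap is attributable to rounding.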
The United States also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' road network, spanning 4 million miles (6.4 million kilometers) and owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the world's 50 busiest airports, 16 are in the United States, including five of the top ten. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are mostly privately owned operations, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 Native American tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto national language, and in 2025, Executive Order 14224 declared it the official language of the United States. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken at home by 1 million people in 2010, fell to 857,000 speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
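The population-clock rate quoted near the start of this passage converts directly into the stated daily gain. A quick Python check, assuming nothing beyond the 16-second interval given in the text:

```python
# Conversion of the Census Bureau population-clock rate quoted above
# (one net additional person every 16 seconds) into daily and annual gains.
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400
interval_s = 16

per_day = SECONDS_PER_DAY / interval_s
print(f"net gain per day: {per_day:,.0f}")         # 5,400, as stated
print(f"net gain per year: {per_day * 365:,.0f}")  # ~1,971,000
```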
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities (New York City, Los Angeles, Chicago, and Houston) had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. 
In 2010, President Barack Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity, the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
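Returning to the student-debt figures cited earlier in this passage: the growth rate can be made concrete with a little arithmetic. The sketch below treats the $1.7 trillion total reported for 2022 as a rough proxy for the 2020 level, an assumption made purely for illustration, and backs out the 2010 baseline implied by the stated 102% increase:

```python
# Illustration of the student-debt growth figure quoted above: a 102%
# increase means the 2020 total was 2.02 times the 2010 total. The 2022
# total ($1.7 trillion) is used as an approximate stand-in for the 2020
# level (an assumption for this sketch only).
total_trillions = 1.7
increase = 1.02            # +102% between 2010 and 2020

implied_2010 = total_trillions / (1 + increase)
print(f"implied 2010 baseline: ~${implied_2010:.2f} trillion")  # ~$0.84 trillion
```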
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Federal Council on the Arts and the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789 and often considered the first American novel. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered on industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox); all four are commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In 2020, there were 15,460 licensed full-power radio stations in the U.S., according to the Federal Communications Commission (FCC). Much of public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT, all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry comprised 2,457 companies supporting around 220,000 jobs and generated $30.4 billion in revenue. There were 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. The U.S. also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one award is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the medieval style of woodworking and primitive sculpture remained integral to early American folk art despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries; although the new English styles arrived early enough to have made a considerable impact, American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but rural artisans there tended to continue their traditional forms longer than their urban counterparts did, and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and some trace back to origins such as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. Manhattan's Garment District has been closely identified with American fashion since the industry's emergence there in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, has become metonymous with the American filmmaking industry. The major film studios of the United States are the primary source of the world's most commercially successful films. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, and potatoes, along with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It went on to become the United States' most prestigious culinary school, where many of the most talented American chefs have studied prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020 and employed more than 15 million people, representing about 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts, and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL has the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. At the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are among the most-watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as baseball, where Minor League Baseball fills that role. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is home to a number of prestigious events, including the America's Cup, the World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and the Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States; its final match was attended by 90,185 spectators, setting the world record at the time for the largest crowd at a women's sporting event. The United States hosted the 1994 FIFA World Cup and will co-host the 2026 FIFA World Cup along with Canada and Mexico.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Goal] | [TOKENS: 2051]
Contents Goal A goal or objective is an idea of the future or desired result that a person or a group of people envision, plan, and commit to achieve. People endeavour to reach goals within a finite time by setting deadlines. A goal is roughly similar to a purpose or aim, the anticipated result which guides reaction, or an end, which is an object, either a physical object or an abstract object, that has intrinsic value. Goal setting Goal-setting theory was formulated based on empirical research and has been called one of the most important theories in organizational psychology. Edwin A. Locke and Gary P. Latham, the fathers of goal-setting theory, provided a comprehensive review of the core findings of the theory in 2002. In summary, Locke and Latham found that specific, difficult goals lead to higher performance than either easy goals or instructions to "do your best", as long as feedback about progress is provided, the person is committed to the goal, and the person has the ability and knowledge to perform the task. According to Locke and Latham, goals affect performance in the following ways: Some coaches recommend establishing specific, measurable, achievable, relevant, and time-bounded (SMART) objectives, but not all researchers agree that these SMART criteria are necessary. The SMART framework does not include goal difficulty as a criterion; in the goal-setting theory of Locke and Latham, it is recommended to choose goals within the 90th percentile of difficulty, based on the average prior performance of those that have performed the task (see the sketch after this passage). Goals can be long-term, intermediate, or short-term. The primary difference is the time required to achieve them. Short-term goals are expected to be finished in a relatively short period of time, long-term goals in a long period of time, and intermediate goals in a medium period of time. Before an individual can set out to achieve a goal, they must first decide on what their desired end-state will be. Peter Gollwitzer's mindset theory of action phases proposes two phases that an individual must go through to achieve a goal. In the first phase, the individual mentally selects their goal by specifying the criteria and deciding on which goal to set based on their commitment to seeing it through. In the second, planning phase, the individual decides which set of behaviors at their disposal will best allow them to reach their desired end-state or goal. Goal characteristics Certain characteristics of a goal help define the goal and determine an individual's motivation to achieve it. These characteristics make it possible to determine what motivates people to achieve a goal and, along with other personal characteristics, may predict goal achievement.[citation needed] Personal goals Individuals can set personal goals: a student may set a goal of a high mark in an exam; an athlete might run five miles a day; a traveler might try to reach a destination city within three hours; an individual might try to reach financial goals such as saving for retirement or saving for a purchase. Managing goals can give returns in all areas of personal life. Knowing precisely what one wants to achieve makes clear what to concentrate on and improve, and can often help one subconsciously prioritize that goal. However, successful goal adjustment (goal disengagement and goal re-engagement capacities) is also a part of leading a healthy life. 
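The Locke-Latham difficulty heuristic described above, setting a goal near the 90th percentile of prior performance on the task, is easy to express computationally. A minimal Python sketch; the sample scores below are invented purely for illustration:

```python
# A minimal sketch of the 90th-percentile difficulty heuristic: take prior
# performance scores on a task and set the goal at their 90th percentile.
import statistics

prior_performance = [12, 15, 9, 18, 14, 21, 11, 16, 13, 17]  # e.g. units per hour

# statistics.quantiles with n=10 returns nine cut points (deciles);
# the last one, at index 8, is the 90th percentile.
goal = statistics.quantiles(prior_performance, n=10)[8]
print(f"suggested stretch goal: {goal:.1f} units per hour")
```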
Goal setting and planning ("goal work") promotes long-term vision, intermediate mission, and short-term motivation. It focuses intention, desire, and acquisition of knowledge, and helps to organize resources. Efficient goal work includes recognizing and resolving any guilt, inner conflict, or limiting beliefs that might cause one to sabotage one's efforts. By setting clearly defined goals, one can subsequently measure and take pride in the accomplishment of those goals. One can see progress in what might have seemed a long, perhaps difficult, grind. Achieving complex and difficult goals requires focus, long-term diligence, and effort (see Goal pursuit). Success in any field requires forgoing excuses and justifications for poor performance or lack of adequate planning; in short, success requires emotional maturity. The measure of belief that people have in their ability to achieve a personal goal also affects that achievement. Long-term achievements rely on short-term achievements. Emotional control over the small moments of the single day can make a big difference in the long term. Considerable research has examined the link between achieving desired goals, changes to self-efficacy and integrity, and ultimately changes to subjective well-being. Goal efficacy refers to how likely an individual is to succeed in achieving their goal. Goal integrity refers to how consistent one's goals are with core aspects of the self. Research has shown that a focus on goal efficacy is associated with happiness, a factor of well-being, and goal integrity is associated with meaning, another factor of well-being. Multiple studies have shown the link between achieving long-term goals and changes in subjective well-being; most research shows that achieving goals that hold personal meaning to an individual increases feelings of subjective well-being. Psychologist Robert Emmons found that when humans pursue meaningful projects and activities without primarily focusing on happiness, happiness often results as a by-product. Indicators of meaningfulness predict positive effects on life, while lack of meaning predicts negative states such as psychological distress. Emmons summarizes the four categories of meaning which have appeared throughout various studies. He proposes to call them WIST: work, intimacy, spirituality, and transcendence. Furthermore, those who value extrinsic goals more highly than intrinsic goals tend to have lower subjective well-being and higher levels of anxiety. Self-concordance model The self-concordance model looks at the sequence of steps that occur from the commencement of a goal to attaining that goal. It considers the likelihood and impact of goal achievement based on the type of goal and the meaning of the goal to the individual.[citation needed] Different types of goals impact both goal achievement and the sense of subjective well-being brought about by achieving the goal. The model breaks down factors that promote, first, striving to achieve a goal, then achieving a goal, and then the factors that connect goal achievement to changes in subjective well-being. Goals that are pursued to fulfill intrinsic values or to support an individual's self-concept are called self-concordant goals. Self-concordant goals fulfill basic needs and align with what psychoanalyst Donald Winnicott called an individual's "True Self". 
Because these goals have personal meaning to an individual and reflect an individual's self-identity, self-concordant goals are more likely to receive sustained effort over time. In contrast, goals that do not reflect an individual's internal drive and are pursued due to external factors (e.g., social pressures) emerge from a non-integrated region of a person and are therefore more likely to be abandoned when obstacles occur. Those who attain self-concordant goals reap greater well-being benefits from their attainment. Attainment-to-well-being effects are mediated by need satisfaction, i.e., daily activity-based experiences of autonomy, competence, and relatedness that accumulate during the period of striving. The model has been shown to provide a satisfactory fit to three longitudinal data sets and to be independent of the effects of self-efficacy, implementation intentions, avoidance framing, and life skills. Furthermore, self-determination theory and research surrounding this theory show that if an individual effectively achieves a goal, but that goal is not self-endorsed or self-concordant, well-being levels do not change despite goal attainment. Goal setting management in organizations In organizations, goal management consists of the process of recognizing or inferring goals of individual team members, abandoning goals that are no longer relevant, identifying and resolving conflicts among goals, and prioritizing goals consistently for optimal team collaboration and effective operations. For any successful commercial system, it means deriving profits by making the best quality of goods or services available to end-users (customers) at the best possible cost.[citation needed] Other business goals include retaining independence, personal satisfaction for business owners, and making a social contribution. For public bodies, goals include achieving improvement and demonstrating value for money, promoting democracy, and protecting people. Goal management includes: Jens Rasmussen and Morten Lind distinguish three fundamental categories of goals related to technological system management. These are: Organizational goal-management aims for individual employee goals and objectives to align with the vision and strategic goals of the entire organization. Goal-management provides organizations with a mechanism[which?] to effectively communicate corporate goals and strategic objectives to each person across the entire organization.[citation needed] The key consists of having it all emanate from a pivotal source and providing each person with a clear, consistent organizational-goal message, so that every employee understands how their efforts contribute to an enterprise's success.[citation needed] An example of goal types in business management: Goal displacement Goal displacement occurs when the original goals of an entity or organization are replaced over time by different goals. In some instances, this creates problems, because the new goals may exceed the capacity of the mechanisms put in place to meet the original goals. New goals adopted by an organization may also increasingly become focused on internal concerns, such as establishing and enforcing structures for reducing common employee disputes. In some cases, the original goals of the organization become displaced in part by repeating behaviors that become traditional within the organization. 
For example, a company that manufactures widgets may decide to seek good publicity by putting on a fundraising drive for a popular charity or by having a tent at a local county fair. If the fundraising drive or county fair tent is successful, the company may choose to make this an annual tradition and may eventually involve more and more employees and resources in the new goal of raising the most charitable funds or of having the best county fair tent. In some cases, goals are displaced because the initial problem is resolved or the initial goal becomes impossible to pursue. A famous example is the March of Dimes, which began as an organization to fund the fight against polio, but once that disease was effectively brought under control by the polio vaccine, transitioned to being an organization for combating birth defects.
========================================
[SOURCE: https://en.wikipedia.org/wiki/War_of_1812] | [TOKENS: 16830]
Contents War of 1812 The War of 1812 was fought by the United States and its allies against the United Kingdom and its allies in North America. It began when the United States declared war on Britain on 18 June 1812. Although peace terms were agreed upon in the December 1814 Treaty of Ghent, the war did not officially end until the peace treaty was ratified by the United States Congress on 17 February 1815. Anglo-American tensions stemmed from long-standing differences over territorial expansion in North America and British support for Tecumseh's confederacy, which resisted U.S. colonial settlement in the Old Northwest. In 1807, these tensions escalated after the Royal Navy began enforcing tighter restrictions on American trade with France and impressed sailors who were originally British subjects, even those who had acquired American citizenship. Opinion in the U.S. was split on how to respond, and although majorities in both the House and Senate voted for war in June 1812, they were divided along strict party lines, with the Democratic-Republican Party in favour and the Federalist Party against.[d] News of British concessions made in an attempt to avoid war did not reach the U.S. until late July, by which time the conflict was already underway. At sea, the Royal Navy imposed an effective blockade on U.S. maritime trade, while between 1812 and 1814 British regulars and colonial militia defeated a series of American invasions in Upper Canada. The April 1814 abdication of Napoleon allowed the British to send additional forces to North America and reinforce the Royal Navy blockade, crippling the American economy. In August 1814, negotiations began in Ghent, with both sides wanting peace; the British economy had been severely impacted by the trade embargo, while the Federalists convened the Hartford Convention in December to formalize their opposition to the war. In August 1814, British troops captured Washington, before American victories at Baltimore and Plattsburgh in September ended fighting in the north. In the Southeastern United States, American forces and Indian allies defeated an anti-American faction of the Muscogee. The Treaty of Ghent was signed in December 1814, though it would be February before word reached the United States and the treaty was fully ratified. In the interim, American troops led by Andrew Jackson repulsed a major British attack on New Orleans. Origins The origins of the War of 1812 (1812–1815), between the United States and the United Kingdom's British Empire and their First Nation allies, have been long debated. Multiple factors led to the US declaration of war on Great Britain that began the War of 1812: American expansion into the Northwest Territory (now Ohio, Indiana, Michigan, Illinois, Wisconsin, and northeast Minnesota) was impeded by Native American raids. Some historians maintain that an American goal in the war was to annex some or all of Canada, a view many Canadians still share.[citation needed] However, many argue that inducing the fear of such a seizure was merely an American tactic, which was designed to obtain a bargaining chip. Some members of the British Parliament and dissident American politicians such as John Randolph of Roanoke claimed that American expansionism, rather than maritime disputes, was the primary motivation for the American declaration of war. That view has been retained by some historians. 
Although the British made some concessions before the war on neutral trade, they insisted on the right to reclaim their deserting sailors. The British also had long had a goal of creating a large "neutral" Native American state that would cover much of Ohio, Indiana, and Michigan. They made the demand as late as 1814 at the Ghent Peace Conference, but they lost battles that would have upheld those claims. Names The War of 1812 has been referred to by many different names. In Britain, the war is sometimes referred to as "The American War of 1812" in order to distinguish it from the Napoleonic Wars, which were occurring at the same time. The Federalist Party, which was staunchly against the war, referred to it as "Mr. Madison's War" (or simply "Madison's War") due to American President James Madison's declaration of war. Some in the United States refer to the war as the "Second War of American Independence" and as the "Second American Revolution" due to the belief that American sovereignty was once again on the line. Forces During the years 1810–1812, American naval ships were divided into two major squadrons, with the "northern division", based at New York, commanded by Commodore John Rodgers, and the "southern division", based at Norfolk, commanded by Commodore Stephen Decatur. Although it posed little threat to Canada in 1812, the United States Navy was a well-trained and professional force comprising over 5,000 sailors and marines. It had 14 ocean-going warships, with three of its five "super-frigates" non-operational at the onset of the war. Its principal problem was lack of funding, as many in Congress did not see the need for a strong navy. The biggest ships in the American navy were frigates, and there were no ships-of-the-line capable of engaging in a fleet action with the Royal Navy. On the high seas, the Americans pursued a strategy of commerce raiding, capturing or sinking British merchantmen with their frigates and privateers. The Navy was largely concentrated on the Atlantic coast before the war, as it had only two gunboats on Lake Champlain, one brig on Lake Ontario, and another brig on Lake Erie when the war began. The United States Army was initially much larger than the British Army in North America. Many men carried their own long rifles, while the British were issued muskets, except for one unit of 500 riflemen. Leadership was inconsistent in the American officer corps: some officers proved themselves to be outstanding, but many others were inept, owing their positions to political favours. Congress was hostile to a standing army, and the government called out 450,000 men from the state militias during the war. The state militias were poorly trained, armed, and led. The failed invasion of Lake Champlain led by General Dearborn illustrates this. The British Army soundly defeated the Maryland and Virginia militias at the Battle of Bladensburg in 1814, and President Madison commented, "I could never have believed so great a difference existed between regular troops and a militia force, if I had not witnessed the scenes of this day". As the war progressed, the army resorted to harsher measures, including increasing use of the death penalty, in order to maintain discipline. The United States was only a secondary concern to Britain so long as the Napoleonic Wars with France continued. In 1813, France had 80 ships-of-the-line and was building another 35. 
Containing the French fleet was the main British naval concern, leaving only the ships on the North American and Jamaica Stations immediately available. In Upper Canada, the British had the Provincial Marine. While its vessels were largely unarmed, the Provincial Marine was essential for keeping the army supplied, since the roads in Upper Canada were abysmal. At the onset of war, the Provincial Marine had four small armed vessels on Lake Ontario, three on Lake Erie, and one on Lake Champlain. The Provincial Marine greatly outnumbered anything the Americans could bring to bear on the Great Lakes. When the war broke out, the British Army in North America numbered 9,777 men in regular units and fencibles.[e] While the British Army was engaged in the Peninsular War, few reinforcements were available. Although the British were outnumbered, the long-serving regulars and fencibles were better trained and more professional than the hastily expanded United States Army. The militias of Upper Canada and Lower Canada were initially far less effective, but substantial numbers of full-time militia were raised during the war and played pivotal roles in several engagements, including the Battle of the Chateauguay, which caused the Americans to abandon the Saint Lawrence River theatre. The highly decentralized bands and tribes considered themselves allies of, and not subordinates to, the British or the Americans. Various tribes fighting with United States forces provided them with their "most effective light troops", while the British needed Indigenous allies to compensate for their numerical inferiority. The Indigenous allies of the British, Tecumseh's confederacy in the west and the Iroquois in the east, avoided pitched battles and relied on irregular warfare, including raids and ambushes that took advantage of their knowledge of terrain. In addition, they were highly mobile, able to march 30–50 miles (50–80 km) a day. Their leaders sought to fight only under favourable conditions and would avoid any battle that promised heavy losses, doing what they thought best for their tribes. The Indigenous fighters saw no issue with withdrawing when needed to avoid casualties. Where possible, they sought to surround an enemy while avoiding being surrounded themselves, making effective use of the terrain. Their main weapons were a mixture of muskets, rifles, bows, tomahawks, knives, and swords, as well as clubs and other melee weapons, which sometimes had the advantage of being quieter than guns. Declaration of war On 1 June 1812, Madison sent a message to Congress recounting American grievances against Great Britain, though not specifically calling for a declaration of war. The House of Representatives then deliberated for four days behind closed doors before voting 79 to 49 (61%) in favour of the first declaration of war. The Senate concurred in the declaration by a 19 to 13 (59%) vote in favour. The declaration focused mostly on maritime issues, especially involving British blockades, with two thirds of the indictment devoted to such impositions, initiated by Britain's Orders in Council.[f] The conflict began formally on 18 June 1812, when Madison signed the measure into law; he proclaimed it the next day. This was the first time that the United States had formally declared war on another nation, and the declaration passed by the smallest margin of any vote for war in American history. None of the 39 Federalists in Congress voted in favour of the war, while other critics referred to it as "Mr. Madison's War". 
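The vote shares quoted above follow directly from the tallies. A quick Python check, using only the counts given in the text:

```python
# Check of the declaration-of-war vote shares quoted above, computed from
# the yea/nay tallies given in the text.
votes = {"House": (79, 49), "Senate": (19, 13)}

for chamber, (yea, nay) in votes.items():
    share = 100 * yea / (yea + nay)
    print(f"{chamber}: {yea}-{nay}, {share:.1f}% in favour")
# House: 79-49, 61.7% in favour  (the text truncates this to 61%)
# Senate: 19-13, 59.4% in favour (text: 59%)
```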
Just days after war had been declared, a small number of Federalists in Baltimore were attacked for printing anti-war views in a newspaper, which eventually led to over a month of deadly rioting in the city. Prime Minister Spencer Perceval was assassinated in London on 11 May, and Lord Liverpool came to power. He wanted a more practical relationship with the United States. On 23 June, he issued a repeal of the Orders in Council, but the United States was unaware of this, as it took three weeks for the news to cross the Atlantic. On 28 June 1812, HMS Colibri was dispatched from Halifax to New York under a flag of truce. She anchored off Sandy Hook on 9 July and left three days later carrying a copy of the declaration of war, British ambassador to the United States Augustus Foster, and consul Colonel Thomas Henry Barclay. She arrived in Halifax, Nova Scotia, eight days later. The news of the declaration took even longer to reach London. British commander Isaac Brock in Upper Canada received the news much faster. He issued a proclamation alerting citizens to the state of war and urging all military personnel "to be vigilant in the discharge of their duty", so as to prevent communication with the enemy and to arrest anyone suspected of helping the Americans. He also ordered the British garrison of Fort St. Joseph on Lake Huron to capture the American fort at Mackinac. This fort commanded the passage between Lakes Huron and Michigan, which was important to the fur trade. The British garrison, aided by fur traders of the North West Company and Sioux, Menominee, Winnebago, Chippewa, and Ottawa allies, immediately besieged and captured Mackinac. Course of war The war was conducted in several theatres: The war had been preceded by years of diplomatic dispute, yet neither side was ready for war when it came. Britain was heavily engaged in the Napoleonic Wars, most of the British Army was deployed in the Peninsular War in Portugal and Spain, and the Royal Navy was blockading most of the coast of Europe. The number of British regular troops present in Canada in July 1812 was officially 6,034, supported by additional Canadian militia. Throughout the war, the British War Secretary was Earl Bathurst, who had few troops to spare for reinforcing North American defences during the first two years of the war. He urged Lieutenant General George Prevost to maintain a defensive strategy. Prevost, who had the trust of the Canadians, followed these instructions: he concentrated on defending Lower Canada at the expense of Upper Canada, which was more vulnerable to American attacks, and allowed few offensive actions. Unlike commanders along the east coast, Prevost had to operate with no support from the Royal Navy. The United States was also not prepared for war. Madison had assumed that the state militias would easily seize Canada and that negotiations would follow. In 1812, the regular army consisted of fewer than 12,000 men. Congress authorized the expansion of the army to 35,000 men, but the service was voluntary and unpopular; it paid poorly, and there were initially few trained and experienced officers. The militia objected to serving outside their home states; they were undisciplined and performed poorly against British forces when called upon to fight in unfamiliar territory. Multiple militias refused orders to cross the border and fight on Canadian soil. American prosecution of the war suffered from its unpopularity, especially in New England, where anti-war speakers were vocal. 
Massachusetts Congressmen Ebenezer Seaver and William Widgery were "publicly insulted and hissed" in Boston, while a mob seized Plymouth's Chief Justice Charles Turner on 3 August 1812 "and kicked [him] through the town". The United States had great difficulty financing its war. It had disbanded its national bank, and private bankers in the Northeast were opposed to the war, but it obtained financing from London-based Barings Bank to cover overseas bond obligations. New England failed to provide militia units or financial support, which was a serious blow, and the New England states made loud threats to secede, as evidenced by the Hartford Convention. Britain exploited these divisions, opting not to blockade the ports of New England for much of the war and encouraging smuggling.

An American army commanded by William Hull invaded Upper Canada on 12 July, arriving at Sandwich (Windsor, Ontario) after crossing the Detroit River. Hull issued a proclamation ordering all British subjects to surrender. The proclamation said that Hull wanted to free them from the "tyranny" of Great Britain, giving them the liberty, security, and wealth that his own country enjoyed – unless they preferred "war, slavery and destruction". He also threatened to kill any British soldier caught fighting alongside Indigenous fighters. Hull's proclamation only helped to stiffen resistance to the American attacks, and he lacked the artillery and supplies to press them. Hull withdrew to the American side of the river on 7 August 1812 after receiving news of a Shawnee ambush on Major Thomas Van Horne's 200 men, who had been sent to support the American supply convoy. Hull also faced a lack of support from his officers and fear among his troops of a possible massacre by unfriendly Indigenous forces. A group of 600 troops led by Lieutenant Colonel James Miller remained in Canada, attempting to supply the American position in the Sandwich area, with little success.

Major General Isaac Brock believed that he should take bold measures to calm the settler population in Canada and to convince the tribes that Britain was strong. He moved to Amherstburg near the western end of Lake Erie with reinforcements and attacked Detroit, using Fort Malden as his stronghold. Hull feared that the British possessed superior numbers, and Fort Detroit lacked adequate gunpowder and cannonballs to withstand a long siege. He agreed to surrender on 16 August. Hull had also ordered the evacuation of Fort Dearborn (Chicago) to Fort Wayne, but Potawatomi warriors ambushed the column and escorted the survivors back to the fort, where they were massacred on 15 August. The fort was subsequently burned.[g]

Brock moved to the eastern end of Lake Erie, where American General Stephen Van Rensselaer was attempting a second invasion. The Americans attempted an attack across the Niagara River on 13 October, but they were defeated at Queenston Heights. Brock was killed during the battle, however, and British leadership suffered after his death. American General Henry Dearborn made a final attempt to advance north from Lake Champlain, but his militia refused to go beyond American territory.

After Hull surrendered Detroit, General William Henry Harrison took command of the American Army of the Northwest. He set out to retake the city, which was now defended by Colonel Henry Procter and Tecumseh. A detachment of Harrison's army was defeated at Frenchtown along the River Raisin on 22 January 1813. Procter left the prisoners with an inadequate guard, and his Potawatomi allies killed and scalped 60 captive Americans.
The defeat ended Harrison's campaign against Detroit, but "Remember the River Raisin!" became a rallying cry for the Americans. In May 1813, Procter and Tecumseh laid siege to Fort Meigs in northwestern Ohio. Tecumseh's fighters ambushed American reinforcements who arrived during the siege, but the fort held out. The fighters eventually began to disperse, forcing Procter and Tecumseh to return to Canada. On the way they attempted to storm Fort Stephenson, a small American post on the Sandusky River near Lake Erie, but were repulsed with serious losses, marking the end of the Ohio campaign.

Captain Oliver Hazard Perry fought the Battle of Lake Erie on 10 September 1813. His decisive victory at Put-in-Bay ensured American military control of the lake, improved American morale after a series of defeats and compelled the British to fall back from Detroit. This enabled General Harrison to launch another invasion of Upper Canada, which culminated in the American victory at the Battle of the Thames on 5 October 1813, where Tecumseh was killed.

The Mississippi River valley was the western frontier of the United States in 1812. The territory acquired in the Louisiana Purchase of 1803 contained almost no American settlements west of the Mississippi except around St. Louis and a few forts and trading posts in the Boonslick. Fort Belle Fontaine, an old trading post converted to an Army post in 1804, served as regional headquarters. Fort Osage, built in 1808 along the Missouri River, was the westernmost American outpost, but it was abandoned at the start of the war. Fort Madison, built along the Mississippi in Iowa in 1808, had been repeatedly attacked by British-allied Sauk since its construction. The United States Army abandoned Fort Madison in September 1813 after Indigenous fighters, supported by the British, attacked and besieged it – one of the few battles fought west of the Mississippi, in which Black Hawk played a leadership role.

The American victory on Lake Erie and the recapture of Detroit isolated the British on Lake Huron. In the winter a Canadian party under Lieutenant Colonel Robert McDouall established a new supply line from York to Nottawasaga Bay on Georgian Bay. He arrived at Fort Mackinac on 18 May with supplies and more than 400 militia and Indians, then sent an expedition which successfully besieged and recaptured the key trading post of Prairie du Chien on the Upper Mississippi. The Americans dispatched a substantial expedition to relieve the fort, but Sauk, Fox, and Kickapoo warriors under Black Hawk ambushed it and forced it to withdraw with heavy losses in the Battle of Rock Island Rapids. In September 1814, the Sauk, Fox, and Kickapoo, supported by part of Prairie du Chien's British garrison, repulsed a second American force led by Major Zachary Taylor in the Battle of Credit Island. These victories enabled the Sauk, Fox, and Kickapoo to harass American garrisons further to the south, which led the Americans to abandon Fort Johnson in central Illinois Territory. Consequently, the Americans lost control of almost all of Illinois Territory, although they held onto the St. Louis area and eastern Missouri. The Sauk raided even into these territories, however, clashing with American forces at the Battle of Cote Sans Dessein in April 1815 at the mouth of the Osage River in the Missouri Territory and the Battle of the Sink Hole in May 1815 near Fort Cap au Gris.
This left the British and their Indian allies in control of most of modern Illinois and all of modern Wisconsin. Meanwhile, the British were supplying the Indians in the Old Northwest from Montreal via Mackinac. On 3 July, the Americans sent a force of five vessels from Detroit to recapture Mackinac. A mixed force of regulars and volunteers from the militia landed on the island on 4 August. They did not attempt to achieve surprise, and Indians ambushed them in the brief Battle of Mackinac Island and forced them to re-embark. The Americans discovered the new base at Nottawasaga Bay, and on 13 August they destroyed its fortifications and the schooner Nancy that they found there. They then returned to Detroit, leaving two gunboats to blockade Mackinac. On 4 September, the British surprised, boarded, and captured both gunboats. These engagements on Lake Huron left Mackinac under British control.

The British returned Mackinac and other captured territory to the United States after the war, although some British officers and Canadians objected to handing back Prairie du Chien and especially Mackinac under the terms of the Treaty of Ghent. The Americans, for their part, retained the captured post at Fort Malden near Amherstburg until the British complied with the treaty. Fighting between Americans, the Sauk and other Indigenous tribes continued through 1817, well after the war ended in the east.

Both sides placed great importance on gaining control of the Great Lakes and the St. Lawrence River because of the difficulties of land-based communication. The British already had a small squadron of warships on Lake Ontario when the war began and had the initial advantage. The Americans established a Navy yard at Sackett's Harbor, New York, a port on Lake Ontario. Commodore Isaac Chauncey took charge of the thousands of sailors and shipwrights assigned there and recruited more from New York. They completed a warship (the corvette USS Madison) in 45 days. Ultimately, almost 3,000 men at the shipyard built 11 warships and many smaller boats and transports. Army forces were also stationed at Sackett's Harbor, where they camped throughout the town, far outnumbering its small population of 900; officers were housed with local families. Madison Barracks was later built there.

Having regained the advantage by their rapid building program, Chauncey and Dearborn attacked York, the capital of Upper Canada, on 27 April 1813. At the Battle of York, the outnumbered British regulars destroyed the fort and dockyard and retreated, leaving the militia to surrender the town. American soldiers set fire to the Legislature building and looted and vandalized several government buildings and citizens' homes. The burning of York was a serious blow to the British, depriving them of supplies that would be needed in later battles.

On 25 May 1813, Fort Niagara and the American Lake Ontario squadron began bombarding Fort George. An American amphibious force assaulted Fort George on the northern end of the Niagara River on 27 May and captured it without serious losses. The British abandoned Fort Erie and headed towards Burlington Heights. The British position was close to collapsing in Upper Canada; the Iroquois considered changing sides and ignored a British appeal to come to their aid. However, the Americans did not pursue until the retreating British forces had largely escaped and organized a counter-offensive at the Battle of Stoney Creek on 5 June.
The British launched a surprise attack at 2 a.m., leading to confused fighting and a strategic British victory. The Americans pulled back to Forty Mile Creek rather than continue their advance into Upper Canada. At this point, the Six Nations of the Grand River began to come out to fight for the British, as an American victory no longer seemed inevitable. The Iroquois ambushed an American patrol at Forty Mile Creek, while the Royal Navy squadron based in Kingston sailed in and bombarded the American camp. General Dearborn retreated to Fort George, mistakenly believing that he was outnumbered and outgunned. British Brigadier General John Vincent was encouraged when about 800 Iroquois arrived to assist him. On 24 June, an American force surrendered to a smaller British force at the Battle of Beaver Dams, thanks to advance warning carried by Laura Secord, marking the end of the American offensive into Upper Canada. British Major General Francis de Rottenburg did not have the strength to retake Fort George, so he instituted a blockade, hoping to starve the Americans into surrender. Meanwhile, Commodore James Lucas Yeo had taken charge of the British ships on the lake and mounted a counterattack, which the Americans repulsed at the Battle of Sackett's Harbor. Thereafter, Chauncey's and Yeo's squadrons fought two indecisive actions, off the Niagara on 7 August and at Burlington Bay on 28 September; neither commander was prepared to take major risks to gain a complete victory.

Late in 1813, the Americans abandoned the Canadian territory that they occupied around Fort George. They set fire to the village of Newark (now Niagara-on-the-Lake) on 10 December 1813, incensing the Canadians; many of the inhabitants were left without shelter, freezing to death in the snow. The British retaliated following their capture of Fort Niagara on 18 December 1813. A British-Indian force led by General Phineas Riall stormed the neighbouring town of Lewiston, New York, on 19 December; four American civilians were killed by drunken Indians after the battle. A small force of Tuscarora warriors engaged Riall's men during the battle, which allowed many residents of Lewiston to evacuate the village. The British and their Indian allies subsequently attacked and burned Buffalo on Lake Erie on 30 December 1813 in revenge for the American capture of Fort George in May and the burning of Newark.

The British were vulnerable along the stretch of the St. Lawrence that lay between Upper Canada and the United States. In the winter of 1812–1813, the Americans launched a series of raids from Ogdensburg, New York, that hampered British supply traffic up the river. On 21 February, George Prevost passed through Prescott, Ontario, on the opposite bank of the river with reinforcements for Upper Canada. When he left the next day, the reinforcements and local militia attacked in the Battle of Ogdensburg and the Americans were forced to retreat.

The Americans made two more thrusts against Montreal in 1813. Major General Wade Hampton was to march north from Lake Champlain and join a force under General James Wilkinson that would sail from Sackett's Harbor on Lake Ontario and descend the St. Lawrence. Hampton was delayed by road and supply problems, and his intense dislike of Wilkinson limited his desire to support Wilkinson's plan. On 25 October, Charles de Salaberry defeated Hampton's force of 4,000 at the Chateauguay River with a much smaller force of Canadian Voltigeurs and Mohawks; it numbered only 339 men but held a strong defensive position.
Wilkinson's force of 8,000 set out on 17 October, but it was delayed by weather. Learning that a British force under Captain William Mulcaster and Lieutenant Colonel Joseph Wanton Morrison was pursuing him, Wilkinson landed near Morrisburg, Ontario, by 10 November, about 150 kilometres (90 mi) from Montreal. On 11 November, his rear guard of 2,500 attacked Morrison's force of 800 at Crysler's Farm and was repulsed with heavy losses. After learning that Hampton could not renew his advance, Wilkinson retreated to the United States and settled into winter quarters. He resigned his command after a failed attack on a British outpost at Lacolle Mills.

The Americans again invaded the Niagara frontier. They had occupied southwestern Upper Canada after defeating Colonel Henry Procter at Moraviantown in October and believed that taking the rest of the province would force the British to cede it to them. The end of the war with Napoleon in Europe in April 1814 meant that the British could deploy their army to North America, so the Americans wanted to secure Upper Canada in order to negotiate from a position of strength. They planned to invade via the Niagara frontier while sending another force to recapture Mackinac. They captured Fort Erie on 3 July 1814. Unaware of Fort Erie's fall or of the size of the American force, the British general Phineas Riall engaged Winfield Scott, who defeated him at the Battle of Chippawa on 5 July. The American forces had undergone hard training under Scott and proved themselves professionals under fire. They deployed in a shallow U formation, bringing flanking fire and well-aimed volleys against Riall's men, who were driven from the battlefield.

An attempt to advance further ended with the hard-fought but inconclusive Battle of Lundy's Lane on 25 July. The battle was fought several miles north of Chippawa Creek near Niagara Falls and is considered the bloodiest and costliest battle of the war. Both sides stood their ground, but American General Jacob Brown pulled back to Fort Erie after the battle and the British did not pursue. Riall, Scott, Brown, and the British commander Gordon Drummond were all wounded; Scott's wounds ended his service in the war. The Americans then withstood a prolonged siege of Fort Erie. The British tried to storm the fort on 14 August 1814, but they suffered heavy losses, losing 950 killed, wounded, and captured, compared to only 84 dead and wounded on the American side. The British were further weakened by exposure and shortage of supplies and eventually raised the siege, but American Major General George Izard took over command on the Niagara front and followed up only halfheartedly. An American raid along the Grand River destroyed many farms and weakened British logistics. In October 1814, the Americans advanced into Upper Canada and engaged in skirmishes at Cook's Mill, but pulled back when they heard of the approach of the new British warship HMS St Lawrence, launched in Kingston that September and armed with 104 guns. Lacking provisions, the Americans retreated across the Niagara after destroying Fort Erie.

Meanwhile, after Napoleon abdicated, 15,000 British troops were sent to North America under four of Wellington's ablest brigade commanders, although fewer than half were veterans of the Peninsular War and the rest came from garrisons. Prevost was ordered to burn Sackett's Harbor to gain naval control of Lake Erie, Lake Ontario, and the Upper Lakes, and to defend Lower Canada from attack.
He did defend Lower Canada but otherwise failed to achieve his objectives, so he decided to invade New York State. His army outnumbered the American defenders of Plattsburgh under General Alexander Macomb, but he was worried about his flanks and decided that he needed naval control of Lake Champlain. Upon reaching Plattsburgh, Prevost delayed the assault until Captain George Downie arrived in the hastily built 36-gun frigate HMS Confiance. Confiance was not fully completed and her raw crew had never worked together, but Prevost forced Downie into a premature attack. The British squadron on the lake was evenly matched by the American squadron under Master Commandant Thomas Macdonough. At the Battle of Plattsburgh on 11 September 1814, Confiance suffered heavy casualties and struck her colours, and the rest of the British fleet retreated. Prevost, who had already alienated his veteran officers by insisting on proper dress codes, now lost their confidence, while Macdonough emerged as a national hero. The Americans now had control of Lake Champlain; Theodore Roosevelt later termed it "the greatest naval battle of the war". Prevost then turned back, to the astonishment of his senior officers, saying that it was too hazardous to remain on enemy territory after the loss of naval supremacy. He was recalled to London, where a naval court-martial decided that the defeat had been caused principally by Prevost urging the squadron into premature action and then failing to afford the promised support from the land forces. He died suddenly, just before his own court-martial was to convene. His reputation sank to a new low as Canadians claimed that their militia under Brock had done the job and Prevost had failed. Recent historians have been kinder, however. Peter Burroughs argues that Prevost's preparations for defending the Canadas with limited means were energetic, well-conceived, and comprehensive, and that he achieved the primary objective of preventing an American conquest.

Maine, then part of Massachusetts, was a base for smuggling and illegal trade between the United States and the British. Until 1813, the region was generally quiet except for privateer actions near the coast. In September 1813, the United States Navy's brig Enterprise fought and captured the Royal Navy brig Boxer off Pemaquid Point. On 11 July 1814, Thomas Masterman Hardy took Moose Island (Eastport, Maine) without a shot, and the entire American garrison of 65 men at Fort Sullivan surrendered peacefully. The British temporarily renamed the captured fort "Fort Sherbrooke". In September 1814, John Coape Sherbrooke led 3,000 British troops from his base in Halifax in the "Penobscot Expedition". In 26 days, he raided and looted Hampden, Bangor and Machias, destroying or capturing 17 American ships, and won the Battle of Hampden, losing two men killed to the Americans' one. The retreating American forces were forced to destroy the frigate Adams. The British occupied the town of Castine and most of eastern Maine for the rest of the war, governing it under martial law and re-establishing the colony of New Ireland. The Treaty of Ghent returned this territory to the United States. When the British left in April 1815, they took £10,750 in tariff duties from Castine. This money, called the "Castine Fund", was used to establish Dalhousie University in Halifax. Ownership of the islands in Passamaquoddy Bay was settled by a joint commission in 1817.
However, Machias Seal Island, which had been seized by the British as part of the occupation, was unaddressed by the commission. Though administered by Britain and later Canada ever since, it remains in dispute to this day.

The strategic location of the Chesapeake Bay near the Potomac River made it a prime target for the British. Rear Admiral George Cockburn arrived there in March 1813 and was joined by Admiral Warren, who took command of operations ten days later. In March a squadron under Cockburn began a blockade of the mouth of the Bay at Hampton Roads harbour and raided towns along the Bay from Norfolk, Virginia, to Havre de Grace, Maryland. In late April Cockburn landed at Frenchtown, Maryland, setting fire to the town and destroying ships that were docked there. In the following weeks he routed the local militias and looted and burned three other towns, then marched to the iron foundry at Principio and destroyed it along with sixty-eight cannons.

On 4 July 1813, Commodore Joshua Barney, an American Revolutionary War naval officer, convinced the Navy Department to build the Chesapeake Bay Flotilla, a squadron of twenty barges powered by small sails or oars (sweeps) to defend the Chesapeake Bay. Launched in April 1814, the squadron was quickly cornered on the Patuxent River. While successful in harassing the Royal Navy, it could not stop subsequent British operations in the area.

In August 1814, a force of 2,500 soldiers under Major General Robert Ross had just arrived in Bermuda aboard HMS Royal Oak, three frigates, three sloops and ten other vessels. Released from the Peninsular War by the victory in Europe, these troops were intended for diversionary raids along the coasts of Maryland and Virginia. In response to Prevost's request to retaliate for the property destruction done by American troops, the British decided to employ this force, together with the naval and military units already on the station, to strike at the national capital. Anticipating the attack, American officials removed valuable documents, including the original Constitution, to Leesburg, Virginia.

The British task force advanced up the Chesapeake, routed Commodore Barney's flotilla of gunboats, carried out the Raid on Alexandria and landed the ground forces that bested the US defenders at the Battle of Bladensburg before going on to burn Washington. United States Secretary of War John Armstrong Jr. insisted that the British were going to attack Baltimore rather than Washington, even as British army and naval units were on their way to Washington. Brigadier General William H. Winder, who had burned several bridges in the area, assumed the British would attack Annapolis and was reluctant to engage because he mistakenly thought the British army was twice its actual size. The inexperienced state militia was easily routed at Bladensburg, opening the route to Washington. British troops of the 3rd Brigade, led by Ross and accompanied by Cockburn, attacked and captured Washington with a force of 4,500. On 24 August, after the British had finished looting the interiors, Ross directed his troops to set fire to a number of public buildings, including the White House and the United States Capitol.[h] Extensive damage to the interiors and contents of both was subsequently reported. US government and military officials fled to Virginia, while Secretary of the United States Navy William Jones ordered the Washington Navy Yard and a nearby fort to be razed to prevent their capture.
Public buildings in Washington were destroyed by the British, though private residences were ordered spared. After taking some munitions from the Washington munitions depot, the British boarded their ships and moved on to their major target, the heavily fortified city of Baltimore. Because some of their ships were held up in the Raid on Alexandria, they delayed their movement, allowing Baltimore an opportunity to strengthen its fortifications and bring in new federal troops and state militia units.

The "Battle for Baltimore" began with the British landing on 12 September 1814 at North Point, where they were met by American militia further up the Patapsco Neck peninsula. An exchange of fire began, with casualties on both sides, and the British Army commander, Major General Ross, was killed by snipers. The British paused, then continued to march northwestward to face the Maryland and Baltimore City militia units stationed at Godly Wood. The Battle of North Point was fought for several hours that afternoon as a musketry and artillery duel. The British also planned to attack Baltimore by water on the following day, but the Royal Navy was unable to reduce Fort McHenry at the entrance to Baltimore Harbor in support of an attack from the northeast by the British Army. The British eventually realized that they could not force the passage to attack Baltimore in coordination with the land force. A last-ditch night feint and barge attack during a heavy rainstorm was led by Captain Charles Napier around the fort up the Middle Branch of the river to the west. Split and partly misdirected in the storm, it turned back after suffering heavy casualties from the alert gunners of Fort Covington and Battery Babcock. The British called off the attack and sailed downriver to pick up their army, which had retreated from the east side of Baltimore. All the lights in Baltimore were extinguished the night of the attack, and the fort was bombarded for 25 hours; the only light was given off by the exploding shells over Fort McHenry, illuminating the flag that was still flying over the fort. The defence of the fort inspired the American lawyer Francis Scott Key to write "Defence of Fort M'Henry", a poem that was later set to music as "The Star-Spangled Banner".

Because of the region's polyglot population, both the British and the Americans perceived the war in the Gulf South as a fundamentally different conflict from the one occurring in the Lowcountry and Chesapeake. Before 1813, the war among the Creeks, or Muscogee, had been largely an internal affair sparked by the ideas of Tecumseh farther north in the Mississippi Valley. A faction known as the Red Sticks, so named for the colour of their war clubs, had broken away from the rest of the Creek Confederacy, which wanted peace with the United States. The Red Sticks were allied with Tecumseh, who had visited the Creeks in 1811 and encouraged greater resistance to the Americans. The Creek Nation was a trading partner of the United States, actively involved with British and Spanish trade as well. The Red Sticks, as well as many southern Muscogee people like the Seminole, had a long history of alliance with the British and Spanish empires, an alliance that helped the North American and European powers protect each other's claims to territory in the south.
On 27 July 1813 the Red Sticks were returning from Pensacola with a pack train filled with trade goods and arms when they were attacked by Americans, who made off with their goods. On 30 August 1813, in retaliation for the raid, the Red Sticks, led by the Creek chiefs Red Eagle and Peter McQueen, attacked Fort Mims north of Mobile, the only American-held port in the territory of West Florida. The attack resulted in the deaths of 400 settlers, including women and children, who were killed and scalped, and it became an ideological rallying point for the Americans. It prompted the state of Georgia and the Mississippi militia to take immediate, major action against the Red Sticks.

The Red Stick chiefs gained power in the east along the Alabama River, Coosa River and Tallapoosa River in the Upper Creek territory. By contrast, the Lower Creek, who lived along the Chattahoochee River, generally opposed the Red Sticks and wanted to remain allied to the U.S. Indian agent Benjamin Hawkins recruited Lower Creek to aid the 6th Military District under General Thomas Pinckney and the state militias against the Red Sticks. The combined United States forces numbered 5,000 troops from East and West Tennessee, with about 200 Indigenous allies. At its peak, the Red Stick faction had 4,000 warriors, only a quarter of whom had muskets.

The Indian frontier of western Georgia was the most vulnerable but was already partially fortified. From November 1813 to January 1814, Georgia's militia and auxiliary Federal troops from the Creek and Cherokee nations and the states of North Carolina and South Carolina organized the fortification of defences along the Chattahoochee River and expeditions into Upper Creek territory in present-day Alabama. The army, led by General John Floyd, advanced into the heart of the Creek Holy Grounds and won a major victory against one of the largest Creek towns at the Battle of Autossee, killing an estimated two hundred people. In November, the Mississippi militia, with a combined 1,200 troops, attacked the Econachca encampment in the Battle of Holy Ground on the Alabama River. Tennessee raised a militia of 5,000 under Major General Andrew Jackson and Brigadier General John Coffee and won the battles of Tallushatchee and Talladega in November 1813.

Jackson suffered enlistment problems in the winter and decided to combine his force, composed of Tennessee militia and pro-American Creek, with the Georgia militia. In January, however, the Red Sticks attacked his army at the Battles of Emuckfaw and Enotachopo Creek. Jackson's troops repelled the attackers but, outnumbered, were forced to withdraw to his base at Fort Strother. In January, Floyd's force of 1,300 state militia and 400 Creek moved to join the United States forces in Tennessee, but they were attacked in camp on the Calibee Creek by Tukabatchee Muscogees on 27 January. Jackson's force grew with the arrival of United States Army soldiers, and a second draft of Tennessee state militia, Cherokee, and pro-American Creek swelled his army to around 5,000. In March 1814, they moved south to attack the Red Sticks. On 27 March, Jackson decisively defeated a force of about a thousand Red Sticks at Horseshoe Bend, killing 800 of them at a cost of 49 killed and 154 wounded. Jackson then moved his army to Fort Jackson on the Alabama River.
He promptly turned on the pro-American Creek who had fought with him, compelling their chieftains, along with a single Red Stick chieftain, to sign the Treaty of Fort Jackson, which forced the Creek tribe as a whole to cede most of western Georgia and part of Alabama to the U.S. Both Hawkins and the pro-American Creek strongly opposed the treaty, which they regarded as deeply unjust. The third clause of the treaty also demanded that the Creek cease communicating with the British and Spanish and trade only with United States-approved agents.

British aid to the Red Sticks arrived after the end of the Napoleonic Wars in April 1814 and after Admiral Alexander Cochrane assumed command from Admiral Warren in March. Captain Hugh Pigot arrived in May 1814 with two ships to arm the Red Sticks. He thought that some 6,600 warriors could be armed and recruited, an estimate that was overly optimistic at best: the Red Sticks were in the process of being destroyed as a military force. Cochrane underestimated Jackson's competence and was likely unaware of his progress against the Creek, even after his victory. In April 1814, the British established an outpost on the Apalachicola River (Prospect Bluff Historic Sites). Cochrane sent a company of Royal Marines commanded by Edward Nicolls, the vessels HMS Hermes and HMS Carron and further supplies to meet the Indians in the region. In addition to training them, Nicolls was tasked with raising a force from escaped slaves as part of the Corps of Colonial Marines.

On 12 July 1814, General Jackson complained to Mateo González Manrique, the governor of West Florida at Pensacola, that combatants from the Creek War were being harboured in Spanish territory, and made reference to reports of the British presence on Spanish soil. Although he gave an angry reply to Jackson, Manrique was alarmed at the weak position he found himself in and appealed to the British for help. The British were observed docking on 25 August and unloading the following day.

The first engagement of the British and their Creek allies against the Americans on the Gulf of Mexico coast was the 14 September 1814 attack on Fort Bowyer. Captain William Percy tried to take the United States fort, hoping then to move on Mobile and block United States trade and encroachment on the Mississippi. After the Americans repulsed Percy's forces, the British established a military presence of up to 200 Marines at Pensacola. In November, Jackson's force of 4,000 men took the town, underlining the Americans' superior numbers in the region. The United States force moved to New Orleans in late 1814. Jackson's army of 1,000 regulars and 3,000 to 4,000 militia, pirates and other fighters, as well as civilians and slaves, built fortifications south of the city.

American forces under General James Wilkinson, himself a paid Spanish secret agent, had taken the Mobile area from the Spanish in March 1813. This region was the rump of Spanish West Florida, the western portion of which had been annexed to the United States in 1810. The Americans built Fort Bowyer, a log and earthen-work fort with 14 guns, on Mobile Point to defend it. Major Latour opined that none of the three forts in the area were capable of resisting a siege.

At the end of 1814, the British launched a double offensive in the South, weeks before the Treaty of Ghent was signed.
On the Atlantic coast, Admiral George Cockburn was to close the Intracoastal Waterway trade and land Royal Marine battalions to advance through Georgia to the western territories, while on the Gulf coast, Admiral Alexander Cochrane moved on the new state of Louisiana and the Mississippi Territory. Cochrane's ships reached the Louisiana coast on 9 December and Cockburn arrived in Georgia on 14 December.

The British objective was to gain control of the entrance of the Mississippi. To this end, an expeditionary force of 8,000 troops under General Edward Pakenham attacked Jackson's prepared defences in New Orleans on 8 January 1815. The Battle of New Orleans was an American victory, as the British failed to take the fortifications on the East Bank. The British attack force suffered high casualties, including 291 dead, 1,262 wounded and 484 captured or missing, whereas American casualties were light, with 13 dead, 39 wounded and 19 missing, according to the respective official casualty returns. The battle was hailed as a great victory across the United States, making Jackson a national hero and eventually propelling him to the presidency. In January 1815 Fort St. Philip endured ten days of bombardment from two bomb vessels of the Royal Navy; Robert V. Remini believes this resistance prevented the British from moving their fleet up the Mississippi in support of the land attack. After deciding further attacks would be too costly and unlikely to succeed, the British troops withdrew on 18 January. However, adverse winds slowed the evacuation, and it was not until 27 January 1815 that the land forces rejoined the fleet, allowing for its final departure.

After New Orleans, the British moved to take Mobile as a base for further operations. In preparation, General John Lambert laid siege to Fort Bowyer, taking it on 12 February 1815. However, HMS Brazen brought news of the Treaty of Ghent the next day and the British abandoned the Gulf Coast, which prevented the capture of Mobile and any renewed attack on New Orleans.

Meanwhile, in January 1815, Cockburn succeeded in blockading the southeastern coast of Georgia by occupying Camden County. The British quickly took Cumberland Island, Fort Point Peter and Fort St. Tammany in a decisive victory. Under the orders of his commanding officers, Cockburn's forces relocated many refugee slaves, capturing St. Simons Island in the process. He had orders to recruit as many runaway slaves into the Corps of Colonial Marines as possible and use them to conduct raids in Georgia and the Carolinas. Cockburn also provided thousands of muskets and carbines and a huge quantity of ammunition to the Creek and Seminole Indians for the same purpose. During the invasion of the Georgia coast, an estimated 1,485 people chose to relocate to British territories or join the British military. However, by mid-March, several days after being informed of the Treaty of Ghent, British ships left the area.

The British government did not recognize either West Florida or New Orleans as American territory, and the historian Frank Owsley suggests that the British might have used a victory at New Orleans to demand further concessions from the U.S. However, subsequent research in the correspondence of British ministers at the time suggests otherwise, with specific reference to correspondence from the Prime Minister to the Foreign Secretary dated 23 December 1814. West Florida was the only territory permanently gained by the United States during the war.
In 1812, Britain's Royal Navy was the world's largest and most powerful navy, with over 600 vessels in commission following the defeat of the French Navy at the Battle of Trafalgar in 1805. Most of these ships were employed blockading the French navy and protecting British trade against French privateers, but the Royal Navy still had 85 vessels in American waters, counting all North American and Caribbean waters.[i] The Royal Navy's North American squadron, based in Halifax and Bermuda (two of the colonies that made up British North America), was the most immediately available force; it numbered one small ship of the line and seven frigates as well as nine smaller sloops and brigs and five schooners. By contrast, the entire United States Navy was composed of 8 frigates and 14 smaller sloops and brigs, with no ships of the line. The United States had embarked on a major shipbuilding program before the war at Sackett's Harbor to provide ships for use on the Great Lakes and continued to produce new ships. The British strategy was to protect their own merchant shipping between Halifax and the West Indies, with the order given on 13 October 1812 to enforce a blockade of major American ports to restrict American trade. Because of their numerical inferiority, the American strategy was to cause disruption through hit-and-run tactics such as capturing prizes, and to engage Royal Navy vessels only under favourable circumstances.

Days after the formal declaration of war, the United States put out two small squadrons, including the frigate President and the sloop Hornet under Commodore John Rodgers, and the frigates United States and Congress, with the brig Argus, under Captain Stephen Decatur. These were initially concentrated as one unit under Rodgers, who intended to force the Royal Navy to concentrate its own ships to prevent isolated units being captured by his powerful force. Large numbers of American merchant ships were returning to the United States at the outbreak of war, and the Royal Navy could not watch all the ports on the American seaboard if its ships were concentrated together. Rodgers' strategy worked, in that the Royal Navy concentrated most of its frigates off New York Harbor under Captain Philip Broke, allowing many American ships to reach home. However, Rodgers' own cruise captured only five small merchant ships, and the Americans never subsequently concentrated more than two or three ships together as a unit.

The more recently built frigates of the US Navy were intended to overmatch their opponents. The United States did not believe that it could build a navy large enough to contest the Royal Navy in fleet actions; therefore, where it could be done, individual ships were built to be tougher and larger and to carry more firepower than their equivalents in European navies.[j] The newest three 44-gun ships were designed with a 24-pounder main battery. These frigates were intended to demolish the 36- to 38-gun (18-pounder) frigates that formed the majority of the world's navies, while being able to evade larger ships. Similarly, the Wasp-class ship-sloops were an over-match for the Cruizer-class brigs employed by the British.
The Royal Navy, maintaining more than 600 ships in fleets and stations worldwide, was overstretched and undermanned; the crews of most British ships enforcing the blockade were (with a few notable exceptions) less practised than those of the smaller US Navy.[k] This meant that in single-ship actions the Royal Navy's ships often found themselves against larger ships with larger, better-drilled crews, as the US planners had intended.[l] However, naval ships do not fight as individuals under the code of the duel; they are national instruments of war and are used as such. The Royal Navy counted on its numbers, experience, and traditions to overcome the individually superior American vessels, and since the US Navy found itself mostly blockaded by the end of the war, that reliance proved justified. For all the fame that these single-ship actions received, they in no way affected the outcome of the war in the Atlantic theatre. The final count of frigates lost was three on each side, with most of the US Navy blockaded in port.[m] During the war, the United States Navy captured 165 British merchantmen (although privateers captured many more) while the Royal Navy captured 1,400 American merchantmen. More significantly, the British blockade of the Atlantic coast kept the majority of American warships from putting to sea and shut down both American imports and exports.[n]

Notable single-ship engagements include USS Constitution vs HMS Guerriere on 19 August 1812, USS United States vs HMS Macedonian on 25 October, USS Constitution vs HMS Java on 29–30 December, HMS Shannon vs USS Chesapeake on 1 June 1813 (the bloodiest such action of the war), HMS Phoebe vs USS Essex on 28 March 1814 and HMS Endymion vs USS President on 15 January 1815. In single-ship battles, superior force was the most significant factor. Because the majority of the American ships were of greater force than the British ships of the same class, Britain constructed five 40-gun, 24-pounder heavy frigates and two "spar-decked" frigates (the 60-gun HMS Leander and HMS Newcastle), among other vessels. To counter the American sloops of war, the British constructed the Cyrus-class ship-sloop of 22 guns. The British Admiralty also instituted a new policy that the three American heavy frigates should not be engaged except by a ship of the line or frigates in squadron strength.[o]

The United States Navy's smaller ship-sloops had also won several victories over Royal Navy sloops-of-war, again of smaller armament. The American sloops Hornet, Wasp (1807), Peacock, Wasp (1813) and Frolic were all ship-rigged, while the British Cruizer-class sloops that they encountered were brig-rigged, which gave the Americans a significant advantage. Ship-rigged vessels are more manoeuvrable in battle because they carry a wider variety of sails and are thus more resistant to damage aloft; they can also back sail, literally backing up, or heave to (stop).[p]

The operations of American privateers proved a more significant threat to British trade than the United States Navy. They operated throughout the Atlantic until the close of the war, most notably from Baltimore. American privateers reported taking 1,300 British merchant vessels, compared to 254 taken by the United States Navy, although the insurer Lloyd's of London reported that only 1,175 British ships were taken, 373 of which were recaptured, for a total loss of 802. Canadian historian Carl Benn wrote that American privateers took 1,344 British ships, of which 750 were retaken by the British.
The British tried to limit privateering losses by the strict enforcement of convoys by the Royal Navy and, more directly, by blockading coastal waterways and capturing 278 American privateers. Due to the massive size of the British merchant fleet, American captures affected only 7.5% of the fleet, resulting in no supply shortages or lack of reinforcements for British forces in North America. Of 526 American privateers, 148 were captured by the Royal Navy and only 207 ever took a prize. Because of the large size of their navy, the British relied less on privateering, and the majority of the 1,407 captured American merchant ships were taken by the Royal Navy. The war was the last time the British allowed privateering, since the practice was coming to be seen as politically inexpedient and of diminishing value in maintaining naval supremacy. However, privateering remained popular in the British colonies, and the war was the last hurrah for privateers in the island colony of Bermuda, who vigorously returned to the practice with experience gained in previous wars. The nimble Bermuda sloops captured 298 American ships, while privateer schooners based in continental British North America, especially from Nova Scotia, took 250 American ships and proved especially effective in crippling American coastal trade and capturing American ships closer to shore than the Royal Navy's cruisers.

The naval blockade of the United States began informally in the late fall of 1812. Under the command of British Admiral John Borlase Warren, it extended from South Carolina to Florida and expanded to cut off more ports as the war progressed. Twenty ships were on station in 1812 and 135 were in place by the end of the conflict. In March 1813, the Royal Navy punished the Southern states, which were most vocal about annexing British North America, by blockading Charleston, Port Royal, Savannah, and New York City as well. Additional ships were sent to North America in 1813 and the Royal Navy tightened and extended the blockade, first to the coast south of Narragansett Bay by November 1813 and then to the entire American coast on 31 May 1814. The British needed American foodstuffs for their army in Spain and benefited from trade with New England, so they did not at first blockade New England; in May 1814, following the abdication of Napoleon and the end of those supply problems, New England was blockaded as well. The Delaware River and Chesapeake Bay had been declared in a state of blockade on 26 December 1812.

Illicit trade was carried on through collusive captures arranged between American traders and British officers, and American ships were fraudulently transferred to neutral flags. Eventually, the United States government was driven to issue orders to stop the illicit trading, which only put a further strain on the country's commerce. The British fleet occupied the Chesapeake Bay and attacked and destroyed numerous docks and harbours. The effect was that no foreign goods could enter the United States on ships and only small fast boats could attempt to get out; the cost of shipping rose sharply as a result.[q]

The blockade of American ports later tightened to the extent that most American merchant ships and naval vessels were confined to port. The American frigates USS United States and USS Macedonian ended the war blockaded and hulked in New London, Connecticut.
The two frigates had attempted to set sail to raid British shipping in the Caribbean but were forced to turn back when confronted with a British squadron, and by the end of the war the United States had six frigates and four ships-of-the-line sitting in port. Some merchant ships were based in Europe or Asia and continued operations. Others, mainly from New England, were issued licences to trade by Admiral Warren, commander-in-chief on the American station in 1813; this allowed Wellington's army in Spain to receive American goods and maintained the New Englanders' opposition to the war. The blockade nevertheless decreased American exports from $130 million in 1807 to $7 million in 1814, and most of what was still exported were goods that, ironically, went to supply the enemy in Britain or the British colonies. The blockade had a devastating effect on the American economy, with the value of American exports and imports falling from $114 million in 1811 to $20 million by 1814, while United States Customs receipts fell from $13 million in 1811 to $6 million in 1814, even though Congress had voted to double the rates. The blockade further damaged the American economy by forcing merchants to abandon the cheap and fast coastal trade for slow and more expensive inland roads. In 1814, only 1 out of 14 American merchantmen risked leaving port, as any ship that did was likely to be seized.

As the Royal Navy base that supervised the blockade, Halifax profited greatly during the war. From there, British privateers seized and sold many French and American ships. More than a hundred prize vessels were anchored in St. George's Harbour awaiting condemnation by the Admiralty Court when a hurricane struck in 1815, sinking roughly sixty of them.

The British Royal Navy's blockades and raids allowed about 4,000 African Americans to escape slavery by fleeing American plantations aboard British ships. American slaves near the British military rebelled against their masters and made their way to British encampments. The migrants who settled in Canada became known as the Black Refugees. The blockading British fleet in the Chesapeake Bay received increasing numbers of freed slaves during 1813. By British government order, they were considered free persons when they reached British hands. Alexander Cochrane's proclamation of 2 April 1814 invited Americans who wished to emigrate to join the British; although it did not explicitly mention slaves, it was taken by all as addressed to them. About 2,400 escaped slaves and their families were transported by the Royal Navy to the Royal Naval Dockyard at Bermuda (where they were employed on works about the yard and organized as a militia to aid in the defence of the yard), Nova Scotia and New Brunswick during and after the war. Starting in May 1814, younger male volunteers were recruited into a new Corps of Colonial Marines. They fought for Britain throughout the Atlantic campaign, including the Battle of Bladensburg, the attacks on Washington, D.C., and the Battle of Baltimore, before withdrawing to Bermuda with the rest of the British forces. They were later settled in Trinidad after rejecting orders for transfer to the West India Regiments, forming the community of the Merikins (none of the freed slaves remained in Bermuda after the war). These escaped slaves represented the largest emancipation of African Americans prior to the American Civil War.
Britain paid the United States for the financial loss of the slaves at the end of the war.

Treaty of Ghent

In August 1814, peace discussions began in Ghent (in present-day Belgium); both sides approached negotiations warily.[r] The British strategy for decades had been to create a buffer state in the American Northwest Territory to block American expansion, and Britain also demanded naval control of the Great Lakes and access to the Mississippi River. On the American side, Secretary of State James Monroe instructed the American diplomats sent to Europe to try to convince the British to cede the Canadas, or at least Upper Canada, to the U.S. At a later stage, the Americans also demanded damages for the burning of Washington and for the seizure of ships before the war began. American public opinion was outraged when Madison published the British demands, as even the Federalists were now willing to fight on. A British force burned Washington but failed to capture Baltimore and sailed away when its commander was killed, while in northern New York State, 10,000 British veterans were marching south until a decisive defeat at the Battle of Plattsburgh forced them back to Canada.[s]

British Prime Minister Lord Liverpool, aware of growing opposition to wartime taxation and the demands of merchants for reopened trade with America, realized Britain also had little to gain and much to lose from prolonged warfare, especially given growing concern about the situation in Europe. The main focus of British foreign policy was the Congress of Vienna, at which British diplomats had clashed with Russian and Prussian diplomats over the terms of the peace with France, and there were fears that Britain might have to go to war with Russia and Prussia. Export trade was all but paralyzed, and France was no longer an enemy of Britain after Napoleon fell in April 1814, so the Royal Navy no longer needed to stop American shipments to France or to impress more seamen. The British were preoccupied with rebuilding Europe after the apparent final defeat of Napoleon. Consequently, Lord Liverpool urged the British negotiators to offer a peace based on the restoration of the pre-war status quo. The British negotiators duly dropped their demand for the creation of an Indian neutral zone, which allowed negotiations to resume at the end of October, and the American negotiators accepted the British proposals for a peace based on the pre-war status quo.

Prisoners were to be exchanged and escaped slaves returned to the United States, as at least 3,000 American slaves had escaped to British lines. The British, however, refused to honour this aspect of the treaty, settling some of the newly freed slaves in Nova Scotia and New Brunswick. The Americans protested Britain's failure to return the slaves as a violation of the Treaty of Ghent; after arbitration by the Tsar of Russia, the British paid $1,204,960 in damages to Washington to reimburse the slave owners.

On 24 December 1814, the diplomats finished and signed the Treaty of Ghent. The treaty was ratified by the British Prince Regent three days later, on 27 December. On 17 February 1815, it arrived in Washington, where it was quickly ratified and went into effect, ending the war.
The terms called for all occupied territory to be returned, the prewar boundary between Canada and the United States to be restored, and the Americans to gain fishing rights in the Gulf of Saint Lawrence. The British insisted on the inclusion of provisions to restore to the Indians "all possessions, rights and privileges which they may have enjoyed, or been entitled to in 1811"; the Americans ignored and violated these provisions. The Treaty of Ghent completely maintained Britain's maritime belligerent rights, a key goal for the British, without acknowledging American maritime rights or the end of impressment. American maritime rights were not seriously violated again in the century of peace that lasted until World War I, and the defeat of Napoleon made impressment unnecessary and rendered American grievances moot. In this sense, the United States achieved its goals indirectly and felt its honour had been upheld, even though Britain never formally renounced impressment.

Losses and compensation

Total losses among the various Indigenous nations, including the wounded and missing, are not known, though the number killed or dead from disease has been estimated at no fewer than 10,000 across all Indigenous nations engaged in the conflict. British losses in the war were about 1,160 killed in action and 3,679 wounded, with a further 3,321 British dead from disease. American losses were 2,260 killed in action and 4,505 wounded. While the number of Americans who died from disease is not known, it is estimated that about 15,000 died from all causes directly related to the war. Known Canadian dead, including Canadians who served in Fencible units of the British Army and with Canadian militia units in Upper and Lower Canada, number more than 1,600.

The war added some £25 million to Britain's national debt, which rose from £451 million in 1812 to £841 million in 1814, although this was at a time when Britain was also fighting a war against Napoleon. In the United States, the cost was $90 million, with war spending reaching a peak of 2.7% of GDP. The national debt rose from $45 million in 1812 to $127 million by the end of 1815, although by selling bonds and treasury notes at deep discounts – and often for irredeemable paper money due to the suspension of specie payment in 1814 – the government received only $34 million worth of specie. Stephen Girard, the richest man in the United States at the time, was among those who funded the government's involvement in the war.

The war strained both economies. In the United States, the economy nevertheless grew 3.7% a year from 1812 to 1815, despite a large loss of business by East Coast shipping interests. Prices in 1815 were 15% higher than in 1812, an annual inflation rate of 4.8%. Hundreds of new banks were opened; they largely handled the loans that financed the war, since tax revenues were down. Money that would have been spent on foreign trade was diverted to opening new factories, which were profitable since British factory-made products were unavailable. This gave a major boost to the Industrial Revolution in the United States, as typified by the Boston Associates.
Long-term consequences The border between the United States and Canada remained essentially unchanged by the war, with neither side making meaningful territorial gains.[t] Although the Treaty of Ghent left the original points of contention unaddressed and merely restored the status quo ante bellum, relations between the United States and Britain changed drastically. The issue of impressment became irrelevant as the Royal Navy no longer needed sailors after the war.[citation needed] The long-term results of the war were generally satisfactory for both the United States and Great Britain. Except for occasional border disputes and some tensions during and after the American Civil War, relations between the United States and Britain remained peaceful for the rest of the 19th century. In the 20th century, spurred by multiple world conflicts, the two countries became close allies. The memory of the conflict played a major role in helping to consolidate a Canadian national identity after 1867, the year of Canadian confederation. The Rush–Bagot Treaty between the United States and Britain was enacted in 1817. It demilitarized the Great Lakes and Lake Champlain, where many British naval arrangements and forts still remained, and it laid the basis for a demilitarized boundary that remains in effect to this day. Bermuda had been largely left to the defence of its own militia and privateers before American independence, but the Royal Navy had begun buying up land and operating from there in 1795, after a number of years spent surveying the reefs to find Hurd's channel (which enabled large frigates and ships of the line to pass through the surrounding reefs to Murray's Anchorage and the enclosed harbours). As construction work progressed through the first half of the 19th century, Bermuda became an Imperial fortress and the permanent naval headquarters for the Western hemisphere, housing the Admiralty and serving as a base and dockyard. Defence infrastructure remained the central pillar of Bermuda's economy until after World War II. After the war, pro-British leaders in Upper Canada demonstrated a strong hostility to American influences, including republicanism, and this hostility shaped the colony's policies. Immigration from the United States was discouraged, and favour was shown to the Anglican Church over the more Americanized Methodist Church. The Battle of York had shown the vulnerability of Upper and Lower Canada (the Canadas). In the decades following the war, several projects were undertaken to improve the defence of the colonies against the United States. They included work on La Citadelle at Quebec City, Fort Henry at Kingston, and the rebuilding of Fort York at York. Additionally, work began on the Halifax Citadel to defend the port against foreign navies. Akin to the American view of the conflict as a "Second War of Independence" for the United States, the war was also something of a war of independence for Canada. Before the war, Canada was a mix of French Canadians, native-born British subjects, loyalists, and Americans who had migrated there. Historian Donald R. Hickey maintains that the threat the war posed to Canada helped greatly to cement these disparate groups into a unified nation. The Indigenous tribes allied to the British lost their cause. The Americans rejected the British proposal to create an "Indian barrier state" in the American West at the Ghent peace conference, and the idea never resurfaced. Donald Fixico argues that "[a]fter the War of 1812, the U.S.
negotiated over two hundred Indian treaties that involved the ceding of Indian lands and 99 of these agreements resulted in the creation of reservations west of the Mississippi River". The Indigenous nations lost most of their fur-trapping territory. Indigenous nations were displaced in Alabama, Georgia, New York, and Oklahoma, losing most of what is now Indiana, Michigan, Ohio, and Wisconsin within the Northwest Territory. They came to be seen as an undesirable burden by British policymakers, who now looked to the United States for markets and raw materials. Foreigners, including British fur traders, were prohibited from entering the United States for purposes of trade. British Indian agents, however, continued to meet regularly with their former allies among the tribes of the Old Northwest, but refused to supply them with arms or to help them resist American attempts to displace them. The American government rapidly built a network of forts throughout the Old Northwest, thus establishing firm military control. It also sponsored American fur traders, who outcompeted their British counterparts. Meanwhile, Euro-American settlers rapidly migrated into the Old Northwest, onto the lands occupied by the tribes previously allied with the British. The War of 1812 marked a turning point in the history of the Old Northwest because it established United States authority over the British and Indians of that border region. After the decisive defeat of the Creek Indians at the Battle of Horseshoe Bend in 1814, some Creek warriors escaped to join the Seminole in Florida.[citation needed] The remaining Creek chiefs signed away about half their lands, comprising 23,000,000 acres (9,300,000 ha) and covering much of southern Georgia and two-thirds of modern Alabama. The Creek were cut off from any future help from the Spanish in Florida and from the Choctaw and Chickasaw to the west. The war is seldom remembered in the United Kingdom. The war in Europe against the French Empire under Napoleon ensured that the British never considered the War of 1812 against the United States to be more than a sideshow. Britain's blockade of French trade had worked, and the Royal Navy was the world's dominant naval power (and remained so for over another century). While the land campaigns had contributed to saving Canada, the Royal Navy had shut down American commerce, bottled up the United States Navy in port, and largely suppressed privateering. British businesses, some affected by rising insurance costs, were demanding peace so that trade could resume with the United States. The peace was generally welcomed by the British, although there was disquiet about the rapid growth of the United States. The two nations quickly resumed trade after the end of the war, and a growing friendship followed. The historian Donald Hickey maintains that for Britain, "the best way to defend Canada was to accommodate the United States. This was the principal rationale for Britain's long-term policy of rapprochement with the United States in the nineteenth century and explains why they were so often willing to sacrifice other imperial interests to keep the republic happy". The United States, meanwhile, gained a strong sense of complete independence as its people celebrated their "second war of independence". Nationalism soared after the victory at the Battle of New Orleans. The opposition Federalist Party collapsed owing to its opposition to the war, and the Era of Good Feelings ensued.
No longer questioning the need for a strong navy, the United States built three new 74-gun ships of the line and two new 44-gun frigates shortly after the end of the war. In 1816, the United States Congress passed into law an "Act for the gradual increase of the Navy", at a cost of $1,000,000 a year for eight years, authorizing nine ships of the line and 12 heavy frigates. The captains and commodores of the Navy became the heroes of their generation in the United States, and several war heroes used their fame to win election to national office. Andrew Jackson and William Henry Harrison both benefited from their military successes in winning the presidency, while Representative Richard Mentor Johnson's role during the war helped him attain the vice presidency. During the war, the New England states became increasingly frustrated over how the war was being conducted and how the conflict affected them. They complained that the United States government was not investing enough militarily and financially in the states' defences and that the states should have more control over their militias. Increased taxes, the British blockade, and the occupation of parts of New England by enemy forces also agitated public opinion in the states. At the Hartford Convention, held between December 1814 and January 1815, Federalist delegates deprecated the war effort and sought more autonomy for the New England states. They did not call for secession, but word of their angry anti-war resolutions arrived just as peace was announced and news of the victory at New Orleans became known. The upshot was that the Federalists were permanently discredited and quickly disappeared as a major political force. The war enabled thousands of slaves to escape to freedom, despite the difficulties. The British helped numerous escaped slaves resettle in New Brunswick and Nova Scotia, where Black Loyalists had also been granted land after the American Revolutionary War. Jackson invaded Florida (then part of New Spain) in 1818, demonstrating to Spain that it could no longer control that colonial territory with a small force. Spain sold Florida to the United States in 1819 under the Adams–Onís Treaty, following the First Seminole War. Pratt concludes that "[t]hus indirectly the War of 1812 brought about the acquisition of Florida". Historiography The historiography of the War of 1812 reflects the numerous interpretations of the conflict, especially in reference to the war's outcome. Historians have variously interpreted both the British and the Americans as victors, and substantial academic and popular literature has been published to support each claim. The British viewed the War of 1812 as a minor theatre, overshadowed by the key victories at the Battle of Trafalgar in 1805 and the Battle of Waterloo in 1815 that ushered in the Pax Britannica. In the United States and Upper Canada, nationalistic mythology around the war took hold following its conclusion.[u] The failure of the American invasion of British Canada advanced the concept of a Canadian identity, and Canada remained a distinct region that would continue to evolve into a nation. Americans were able to enforce their sovereignty; both the restoration of honour and what has been called the Second War of Independence are important themes in American historiography and are considered significant results by historians. Indigenous nations are generally held to have lost the war.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Aibell] | [TOKENS: 1977]
Contents Aibell In Irish legend, Aibell (Modern Irish spelling Aoibheall; also anglicised as Aeval or Eevill) was a pre-Christian goddess in the Irish mythology of Munster and the guardian spirit of the Dál gCais, the Delbhna, and the Clan Ó Bríen. Following the Christianisation of Gaelic Ireland, she was demoted in popular belief from a goddess to the Fairy Queen ruling over the Celtic Otherworld of Thomond, or north Munster. The entrance to her kingdom was believed to be at Craig Liath, the grey rock, a hill overlooking the Shannon about two miles north of Killaloe. Aibell also had a lover (called Dubhlainn Ua Artigan) and a magic harp (of which it was said "[w]hoever heard its music did not live long afterwards"). In Irish folklore, she was turned into a white cat by her sister, Clíodhna, and is alleged to have appeared in a dream to Brian Boru, High King of Ireland, on the night before the Battle of Clontarf, prophesying his imminent death and that whichever of his sons he saw first would succeed him. In modern literature in Irish, Aibell appears in many 18th-century Aisling poems composed in Munster Irish. Aibell also serves as the main antagonist of Brian Merriman's celebrated long comic poem Cúirt an Mheán Oíche ("The Midnight Court"), in which she is the presiding judge during an Otherworldly lawsuit in which the women of Ireland sue the men for refusing to marry and father children. Name The name Aoibheall may come from Gaelic aoibh, meaning "beauty" (or aoibhinn, "beautiful"). Alternatively, as a theonym it could be derived from Proto-Celtic *Oibel-ā, literally "burning fire", which may have been a byword for the notion of "ardour"; the Romano-British equivalent of this Proto-Celtic theonym is likely to have been *Oebla. A variant name for the character is Áebinn. Attestations In Seán Ó Seanacháin's song An Buachaill Caol Dubh, Aoibheal appears to the "Dark Slender Boy" (representing alcohol addiction) and his friend the drinker. In the last verse, Ó Seanacháin expands on this by saying that, when Aoibheal met the two of them walking the road, she promised the lad a hundred men if he would let go of the poet. The lad replied that he was steadfast and true and would not desert his friends until they died. Thus the poet acknowledges that his addiction will never disappear. One traditional retelling runs: And Aoibhell, another woman of the Sidhe, made her dwelling-place in Craig Liath, and at the time of the battle of Cluantarbh she set her love on a young man of Munster, Dubhlaing ua Artigan, that had been sent away in disgrace by the King of Ireland. But before the battle he came back to join with Murchadh, the king's son, and to fight for the Gael. And Aoibhell came to stop him; and when he would not stop with her she put a Druid covering about him, that way no one could see him. And he went where Murchadh was fighting, and he made a great attack on the enemies of Ireland, and struck them down on every side. And Murchadh looked around him, and he said: "It seems to me I hear the sound of the blows of Dubhlaing ua Artigan, but I do not see himself." Then Dubhlaing threw off the Druid covering that was about him, and he said: "I will not keep this covering upon me when you cannot see me through it. And come now across the plain to where Aoibhell is," he said, "for she can give us news of the battle." So they went where she was, and she bade them both to quit the battle, for they would lose their lives in it.
But Murchadh said to her, "I will tell you a little true story," he said; "that fear for my own body will never make me change my face. And if we fall," he said, "the strangers will fall with us; and it is many a man will fall by my own hand, and the Gael will be sharing their strong places." "Stop with me, Dubhlaing," she said then, "and you will have two hundred years of happy life with myself." "I will not give up Murchadh," he said, "or my own good name, for silver or gold." And there was anger on Aoibhell when he said that, and she said: "Murchadh will fall, and you yourself will fall, and your proud blood will be on the plain tomorrow." And they went back into the battle, and got their death there. And it was Aoibhell gave a golden harp to the son of Meardha the time he was getting his learning at the school of the Sidhe in Connacht and that he heard his father had got his death by the King of Lochlann. And whoever heard the playing of that harp would not live long after it. And Meardha's son went where the three sons of the King of Lochlann were, and played on his harp for them, and they died. It was that harp Cuchulain heard the time his enemies were gathering against him at Muirthemne, and he knew by it that his life was near its end. Aoibheal also features prominently in the 18th-century comic poem Cúirt an Mheán Oíche by Brian Merriman. The poem begins by using the conventions of the Aisling, or vision poem, in which the poet is out walking when he has a vision of a woman from the other world. Typically, this woman is Ireland, and the poem will lament her lot or call on her "sons" to rebel against foreign tyranny. In Merriman's hands, the convention is made to take a satirical and deeply ironic twist. In the opening section of the poem, a hideous female giant appears to the poet and drags him kicking and screaming to the court of Queen Aoibheal of the Fairies. On the way to the ruined monastery at Moinmoy, the messenger explains that the Queen, disgusted by the twin corruptions of Anglo-Irish landlords and English law, has taken the dispensing of justice upon herself. There follows a traditional court case under the Brehon law form of a three-part debate. In the first part, a young woman calls on Aoibheal and declares her case against the young men of Ireland for their refusal to marry. She complains that, despite increasingly desperate attempts to capture a husband via intensive flirtation at hurling matches, wakes, and pattern days, the young men insist on ignoring her in favour of late marriages to much older women. The young woman further bewails the contempt with which she is treated by the married women of the village. She is answered by an old man who first denounces the wanton promiscuity of young women in general, suggesting that the young woman who spoke before was conceived by a tinker under a cart. He vividly describes the infidelity of his own young wife. He declares his humiliation at finding her already pregnant on their wedding night and the gossip which has surrounded the "premature" birth of "his" son ever since. He disgustedly attacks the dissolute lifestyles of young women in general. Then, however, he declares that there is nothing wrong with his illegitimate children and denounces marriage as "out of date". He demands that the Queen outlaw it altogether and replace it with a system of free love. The young woman, however, is infuriated by the old man's words and is barely restrained from physically attacking him.
She mocks his impotent failure to fulfill his marital duties with his young wife, a homeless beggar who had married him to avoid starvation. The young woman then argues that if his wife has taken a lover, she well deserves one. She calls for the abolition of priestly celibacy, alleging that priests would otherwise make wonderful husbands and fathers; in the meantime, however, she will keep trying to attract an older man in the hope that her unmarried humiliation will finally end. Finally, in the judgement section, Queen Aoibheal rules that all laymen must marry before the age of 21, on pain of corporal punishment at the hands of Ireland's women. She advises the women to equally target the romantically indifferent, homosexuals, and skirt-chasers who boast of the number of women they have used and discarded. Aoibheal tells them to be careful, however, not to leave any man unable to father children. She also states that abolishing priestly celibacy is something only the Vatican can do, and counsels patience. To the poet's horror, the young woman angrily points him out as a 30-year-old bachelor and describes her many failed attempts to attract his interest in hopes of becoming his wife. She declares that he must be the first man to suffer the consequences of the new marriage law. As a crowd of infuriated women prepares to flog him into a quivering bowl of jelly, he awakens to find it was all a terrible nightmare.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-96] | [TOKENS: 6011]
Contents Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated that there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan. The vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the last of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation, which began 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya, during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and other services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies, or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, arts, literature, heraldry, politics, and sports.
Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot do so, a feature they share with fungi; animals ingest organic material and digest it internally. Animals have structural characteristics that set them apart from all other living things: typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally lead to inbreeding depression within a population due to the increased prevalence of harmful recessive traits. Animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites.
Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction in which the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. The selective pressures they impose on one another lead to an evolutionary arms race between predator and prey, resulting in various forms of antagonistic coevolution. Almost all multicellular predators are animals. Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels indirectly acquire the nutrients by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow, to sustain basal metabolism, and to fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via the oxidation of inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move onto land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land are the Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera, and Nematoda. Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres.
Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. Estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine) and free-living or parasitic ways of life, are based on the numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011.[a] Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is produced only by sponges and pelagophyte algae. Its likely source is sponges, based on molecular clock estimates for the origin of 24-ipc production in both groups: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia established that they were animals. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may, however, be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion), from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear, for example, in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms.
However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, placing them in an external phylogeny alongside the outgroups Holomycota (including the fungi), Ichthyosporea, Pluriformea, and Filasterea, with several of the relationships left uncertain. The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. In addition to sponges, Placozoa has no symmetry and was often considered a "missing link" between protists and multicellular animals. The presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, favouring a sponge-sister cladogram in which Porifera branches first, followed in turn by Ctenophora, Placozoa, and finally Cnidaria plus Bilateria (their ctenophore-sister tree simply interchanges the places of ctenophores and sponges). Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to support a ctenophore-sister phylogeny, in which Ctenophora branches first, followed in turn by Porifera, Placozoa, and finally Cnidaria plus Bilateria. Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research.
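The two competing branching orders described above can be written compactly in Newick notation, the standard plain-text format for phylogenetic trees. The sketch below is illustrative only: the tree strings follow the ladder-like branching orders attributed above to the two studies, and the small helper function is a hypothetical convenience for reading off the earliest-branching taxon, not part of any published analysis.

```python
# Two competing hypotheses for the base of the animal tree,
# encoded in Newick notation (deeper nesting = later branching).

# Sponge-sister (Feuda et al., 2017): Porifera branches first.
sponge_sister = "(Porifera,(Ctenophora,(Placozoa,(Cnidaria,Bilateria))));"

# Ctenophore-sister (Schultz et al., 2023): Ctenophora branches first.
ctenophore_sister = "(Ctenophora,(Porifera,(Placozoa,(Cnidaria,Bilateria))));"

def first_branch(newick: str) -> str:
    """Return the earliest-branching taxon of a ladder-shaped Newick tree."""
    # In a ladder tree, the first label after the opening parentheses
    # is the sister group to all remaining taxa.
    return newick.lstrip("(").split(",", 1)[0]

for name, tree in [("sponge-sister", sponge_sister),
                   ("ctenophore-sister", ctenophore_sister)]:
    print(f"{name}: earliest branch = {first_branch(tree)}")
```

Note that the two strings differ only in which of the first two taxa occupies the outermost position, which is exactly the point of contention between the studies.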
The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, whose members have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. In the modern consensus phylogeny, the Bilateria comprise the Xenacoelomorpha and the Nephrozoa, with the latter splitting into the deuterostomes (Ambulacraria and Chordata) and the protostomes (Ecdysozoa and Spiralia). Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. However, over evolutionary time, descendant lineages have evolved which have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes. The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.
History of classification In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul), down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes (radiata) (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia. In human culture The human population exploits a large number of other animal species for food, both from domesticated livestock raised in animal husbandry and, mainly at sea, from the hunting of wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates, including cephalopods, crustaceans, insects (principally bees and silkworms), and bivalve or gastropod molluscs, are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world.
Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. Working animals, including cattle and horses, have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccination was first developed in the late 18th century. Some medicines, such as the cancer drug trabectedin, are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, with invertebrates such as tarantulas, octopuses, and praying mantises, reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots all finding a place. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation it is likewise a symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship.
========================================
[SOURCE: https://en.wikipedia.org/wiki/RCA_Corporation] | [TOKENS: 9394]
Contents RCA Corporation RCA Corporation, founded as the Radio Corporation of America, was a major American electronics company in existence from 1919 to 1987. Initially, RCA was a patent trust owned by a partnership of General Electric (GE), Westinghouse, AT&T Corporation and United Fruit Company. It became an independent company in 1932 after the partners agreed to divest their ownership stakes in settling an antitrust lawsuit brought by the United States. An innovative and progressive company, RCA was the dominant electronics and communications firm in the United States for over five decades. In the early 1920s, RCA was at the forefront of the mushrooming radio industry, both as a major manufacturer of radio receivers and as the exclusive manufacturer of the first superheterodyne receiver. In 1926, the company founded the National Broadcasting Company (NBC), the first nationwide radio network. During the 1920s and 1930s, RCA also pioneered the introduction and development of broadcast television, both black-and-white and, especially, color. Throughout most of its existence, RCA was closely identified with the leadership of David Sarnoff. He became general manager at the company's founding, served as president from 1930 to 1965, and remained active as chairman of the board until the end of 1969. Until the 1970s, RCA maintained a seemingly impregnable stature as corporate America's leading name in technology, innovation, and home entertainment. However, the company's performance began to weaken as it expanded beyond its original focus—developing and marketing consumer electronics and communications in the US—towards the larger goal of operating as a diversified multinational conglomerate. The company also faced increasing domestic competition from international electronics firms such as Sony, Philips, Matsushita and Mitsubishi. RCA suffered enormous financial losses attempting to enter the mainframe computer industry, as well as in other failed projects, including the CED videodisc system. By the mid-1980s, RCA was rebounding, but the company was never able to regain its former eminence. In 1986, RCA was reacquired by General Electric during the Jack Welch era at GE. Welch sold or liquidated most of RCA's assets, retaining only NBC and some government services units. Today, RCA exists as a brand name only; the various RCA trademarks are currently owned by Sony Music Entertainment and Vantiva, which in turn license the RCA brand name and trademarks for various products to several other companies, including Voxx International, Curtis International, AVC Multimedia, TCL Corporation, and Express LUCK International. Establishment by General Electric RCA originated as a reorganization of the Marconi Wireless Telegraph Company of America, commonly called "American Marconi". In 1897, the Wireless Telegraph and Signal Company, Limited, was founded in London to promote the radio inventions—then known as "wireless telegraphy"—of Guglielmo Marconi. As part of worldwide expansion, American Marconi was organized as a subsidiary in 1899, holding the rights to Marconi patents in the United States and Cuba. In 1912, American Marconi took control of the assets of the bankrupt United Wireless Telegraph Company and, from that point forward, was the dominant radio communications company in the United States. When the United States entered World War I in April 1917, the federal government took control of most civilian radio stations to use them for the war effort.
Although the government planned to restore civilian ownership of the radio stations once the war ended, many United States Navy officials hoped to retain a monopoly on radio communication even after the war. Contrary to instructions it had received, the Navy began purchasing large numbers of radio stations. When the war ended, Congress rejected the Navy's efforts to have peacetime control of the radio industry and instructed that the Navy return the stations to their original owners. Due to national security considerations, the Navy was particularly concerned about returning the high-powered international stations to American Marconi, since the majority of its stock was in foreign hands and the British already largely controlled the international undersea telegraph cables. This concern was increased by the announcement in late 1918 of the formation of the Pan-American Wireless Telegraph and Telephone Company, a joint venture between American Marconi and the Federal Telegraph Company, with plans to set up service between the United States and South America. The Navy had installed a high-powered Alexanderson alternator, built by General Electric (GE), at the American Marconi transmitter site in New Brunswick, New Jersey. It proved superior to the spark-gap transmitters traditionally used by the Marconi companies for transatlantic transmissions. Marconi officials were so impressed by the capabilities of the Alexanderson alternators that they began making preparations to adopt them as their standard transmitters for international communication. A tentative plan made with General Electric proposed that over a two-year period the Marconi companies would purchase most of GE's alternator production. However, the U.S. Navy objected to the plan, fearing British domination of international radio communications and the national security concerns this raised. The Navy, claiming support from U.S. President Woodrow Wilson, looked for an alternative that would result in an "all-American" company taking over the American Marconi assets. In April 1919, two naval officers, Admiral H. G. Bullard and Commander S. C. Hooper, met with GE president Owen D. Young and requested a suspension of the pending alternator sales to the Marconi companies. This would leave General Electric without a buyer for its transmitters, so the officers proposed that GE purchase American Marconi and use the assets to form its own radio communications subsidiary. Young consented to this proposal, which, effective November 20, 1919, transformed American Marconi into the Radio Corporation of America. The decision to form the new company was promoted as a patriotic gesture. The corporate officers were required to be citizens of the United States, with a majority of the company stock to be held by U.S. citizens. Upon its founding, RCA was the largest radio communications firm in the United States. Most of the former American Marconi staff continued to work for RCA. Owen Young became the chairman of the board of the new company. Former American Marconi vice president and general manager E. J. Nally became RCA's first president. Nally was succeeded by Major General James G. Harbord, who served from 1922 until January 3, 1930, when he replaced Owen Young as chairman of the board. David Sarnoff, who was RCA's founding general manager, became the company's third president on the same day. RCA worked closely with the federal government and felt it deserved to maintain its predominant role in U.S. radio communications.
At the company's recommendation, President Wilson appointed Rear Admiral Bullard "to attend the stockholders' and director's meetings... in order that he may present and discuss informally the Government's views and interests". The radio industry had been making technical advances, particularly in the area of vacuum tube technology, and GE needed access to additional patents before its new subsidiary could be fully competitive; during this time, American Marconi had been steadily falling behind others in the industry. GE and RCA therefore negotiated a series of mutually beneficial cross-licensing agreements between themselves and various other companies in the industry. On July 1, 1920, the American Telephone & Telegraph Company (AT&T) agreed to purchase 500,000 shares of RCA, although it divested these shares in early 1923. The United Fruit Company held a small portfolio of radio patents and signed two agreements in 1921. GE's traditional electric company rival, the Westinghouse Electric & Manufacturing Company, had also purchased rights to some critical patents, including one for heterodyne receiving originally issued to Reginald Fessenden, plus regenerative circuit and superheterodyne receiver patents issued to Edwin Armstrong. Westinghouse used this position to negotiate a cross-licensing agreement, effective July 1, 1921, that included a concession that 40% of RCA's equipment purchases would be from Westinghouse. Following these transactions, GE owned 30.1% of RCA's stock, Westinghouse 20.6%, AT&T 10.3%, and United Fruit 4.1%, with the remaining 34.9% owned by individual shareholders. In 1930, RCA agreed to occupy the yet-to-be-constructed landmark skyscraper of the Rockefeller Center complex, 30 Rockefeller Plaza, which in 1933 became known as the RCA Building (renamed the GE Building in 1988 and currently known as the Comcast Building, after Comcast acquired NBC). This lease was critical to enabling the massive project to proceed as a commercially viable venture—David Rockefeller cited RCA's action as being responsible for "the salvation of the project". Radio development RCA's primary business objectives at its founding were to provide equipment and services for seagoing vessels, and "worldwide wireless" communication in competition with the existing international undersea telegraph cables. To provide the international service, the company soon undertook a massive project to build a "Radio Central" communications hub at Rocky Point, Long Island, New York, designed to achieve "the realization of the vision of communication engineers to transmit messages to all points of the world from a single centrally located source". Construction began in July 1920, and the site was dedicated on November 5, 1921, after two of the proposed twelve antenna spokes had been completed and two of the 200-kilowatt alternators installed. The debut transmissions received replies from stations in 17 countries. Although the initial installation would remain in operation, the additional antenna spokes and alternator installations were never completed, due to a major discovery about radio signal propagation.
While investigating transmitter "harmonics" – unwanted additional radio signals produced at higher frequencies than a station's normal transmission frequency – Westinghouse's Frank Conrad unexpectedly found that in some cases the harmonics could be heard farther than the primary signal. This had previously been thought impossible, because high-frequency shortwave signals, which have poor groundwave coverage, were believed to have a very limited transmission range. In 1924, Conrad demonstrated to Sarnoff that a low-powered shortwave station in East Pittsburgh, Pennsylvania, could be readily received in London by a simple receiver using a curtain rod as an antenna, matching, at a small fraction of the cost, the performance of the massive alternator transmitters. In 1926, Harold H. Beverage further reported that a shortwave signal, transmitted on a 15-meter wavelength (approximately 20 MHz), was received in South America more readily during the daytime than the 200-kilowatt alternator transmissions. The Alexanderson alternators, control of which had led to RCA's formation, were now considered obsolete, and international radio communication would be conducted primarily using vacuum tube transmitters operating on the shortwave bands. RCA would continue to operate international telecommunications services for the remainder of its existence, through its subsidiary RCA Communications, Inc., and later the RCA Global Communications Company. In 1975, the company formed RCA American Communications, which operated its Satcom series of geostationary communications satellites. International shortwave links were in turn largely supplanted by communications satellites, especially for distributing network radio and television programming. At the time RCA was founded in 1919, all radio and telegraphic communication between China and the US, including official messages, was sent through either German radio or British cable links. The U.S. Navy lobbied RCA to seek a concession for a radio link to China; however, the company was reluctant because its other concessions were already operating at a loss. This link began operation in 1928. The Mackay Radio and Telegraph Company of California signed a similar agreement with China in 1932. RCA claimed this was a breach of contract, on the grounds that its 1928 agreement had given it exclusive rights. The dispute went to arbitration, and in 1935 a decision, issued in Radio Corporation of America v. China, concluded that the Mackay concession was valid, because the earlier RCA concession had not granted exclusive rights. The introduction of organized radio broadcasting in the early 1920s resulted in a dramatic reorientation and expansion of RCA's business activities. The development of vacuum tube radio transmitters made audio transmissions practical, in contrast with the earlier transmitters, which were limited to sending the dits and dahs of Morse code. Since at least 1916, when he was still at American Marconi, David Sarnoff had proposed establishing broadcasting stations, but his memos to management promoting the idea of selling a "Radio Music Box" had not been followed up at the time. A small number of broadcasting stations began operating, and soon interest in the innovation was spreading nationwide. In the summer of 1921, a Madison Square Garden employee, Julius Hopp, devised a plan to raise charitable funds by broadcasting, from ringside, the July 2, 1921 Dempsey-Carpentier heavyweight championship fight to be held in Jersey City, New Jersey.
Hopp recruited theaters and halls as listening locations that would charge admission fees to be used as charitable donations. He also contacted RCA's J. Andrew White, the acting president of the National Amateur Wireless Association (NAWA), an organization originally formed by American Marconi which had been inherited by RCA. White agreed to recruit the NAWA membership for volunteers to provide assistance at the listening sites, and also enlisted David Sarnoff for financial and technical support. RCA was authorized to set up a temporary longwave radio station, located in Hoboken, a short distance from the match site, and operating under the call letters WJY. For the broadcast, White and Sarnoff telephoned commentary from ringside, which was typed up and then read over the air by J. Owen Smith. The demonstration was a technical success, with a claimed audience of 300,000 listeners throughout the northeast. RCA quickly moved to expand its broadcasting activities. In the fall of 1921, it set up its first full-time broadcasting station, WDY, at the Roselle Park, New Jersey company plant. By 1923, RCA was operating three stations—WJZ (now WABC) and WJY in New York City, and WRC (now WTEM) in Washington, D.C. A restriction imposed by AT&T's interpretation of the patent cross-licensing agreements required that the RCA stations remain commercial-free, and they were financed by profits from radio equipment sales. Beginning in 1922, AT&T became heavily involved in radio broadcasting, and soon was the new industry's most important participant. From the beginning, AT&T's policy was to finance stations by commercial sponsorship of the programs. The company also created the first radio network, centered on its New York City station WEAF (now WFAN), using its long-distance telephone lines to interconnect stations. This allowed it to economize by having multiple stations carry the same program. RCA and its partners soon faced an economic crisis, as the costs of providing programming threatened to exceed the funds available from equipment profits. In 1926, AT&T transferred its broadcasting-related activities into a new subsidiary, the Broadcasting Company of America (BCA), with its primary assets consisting of broadcasting stations WEAF in New York City and WCAP in Washington, D.C., plus its network operations. Two months later, AT&T unexpectedly decided to exit the radio broadcasting field, and RCA purchased the BCA subsidiary for $1,000,000. These assets formed the basis for the creation of the National Broadcasting Company (NBC), with ownership divided between RCA (50%), General Electric (30%), and Westinghouse (20%) until 1930, when RCA assumed 100% ownership. This purchase also included the right to begin commercial operations. NBC formed two radio networks that eventually expanded nationwide: the NBC-Red Network, with flagship station WEAF, and NBC-Blue, centered on WJZ. Although NBC was originally promoted as expecting to just break even economically, it soon became extremely profitable, which would be an important factor in helping RCA survive the economic pressures of the Great Depression that began in late 1929. Concerned that NBC's control of two national radio networks gave it too much power over the industry, in 1941 the Federal Communications Commission (FCC) issued an industry review, Report on Chain Broadcasting, which included a rule designed to force NBC to divest one of them. This order was upheld by the U.S.
Supreme Court, and on October 12, 1943, the NBC-Blue network was sold to candy magnate Edward J. Noble for $8,000,000, and renamed "The Blue Network, Inc." In 1946, the name was changed to the American Broadcasting Company (ABC). The "Red" network retained the NBC name and remained under RCA ownership until 1986. For two decades the NBC radio network's roster of stars provided ratings consistently surpassing those of its main competitor, the Columbia Broadcasting System (CBS). But in 1948, as the transition from radio to television was beginning, NBC's leadership came under attack due to what became known as the "Paley raids", named after the president of CBS, William S. Paley. After World War II, the tax rate for annual incomes above $70,000 was 77%, while capital gains were taxed at 25%. Paley worked out an accounting technique whereby individual performers could set up corporations that allowed their earnings to be taxed at the significantly lower capital gains rate (on an additional $100,000 of earnings, a performer would keep only $23,000 after the 77% income tax, but $75,000 after the 25% capital gains tax). Instead of having NBC respond with a similar package, Sarnoff decided that this accounting method was legally and ethically wrong. NBC's performers did not agree, and most of the top stars, including Amos 'n' Andy, Jack Benny, Red Skelton, Edgar Bergen, Burns and Allen, Ed Wynn, Fred Waring, Al Jolson, Groucho Marx and Frank Sinatra, moved from NBC to CBS. As a result, CBS boasted of having sixteen of the twenty top-rated programs in 1949. The consequences would carry over to television, where CBS maintained its newfound dominance for decades. Paley had personally worked to woo the performers, while Sarnoff professed his indifference to the defections, stating at an annual meeting that "Leadership built over the years on a foundation of solid service cannot be snatched overnight by buying a few high-priced comedians. Leadership is not a laughing matter." Following its founding, RCA acted as the sales agent for a small line of Westinghouse- and GE-branded receivers and parts used by home constructors, originally for a limited market of amateur radio enthusiasts. By 1922, the rise of broadcasting had dramatically increased the demand for radio equipment by the general public, and this development was reflected in the title of RCA's June 1, 1922 catalog, "Radio Enters the Home". RCA began selling receivers under the "Radiola" name, marketing equipment produced by GE and Westinghouse under the production agreement that allocated a 60%–40% ratio in output between the two companies. Although the patent cross-licensing agreements had been intended to give the participants domination of equipment sales, the tremendous growth of the market led to fierce competition, and in 1925 RCA fell behind Atwater Kent as the leader in receiver sales. RCA was particularly hamstrung by the need to coordinate its sales within the limits of the GE/Westinghouse production quotas, and often had difficulty keeping up with industry trends. However, the company made a key advance in early 1924 when it began selling the first superheterodyne receivers, whose high level of performance increased the brand's reputation and popularity. RCA was the exclusive manufacturer of superheterodyne radio sets until 1930. All RCA receivers were battery-powered until late 1927, when plug-in AC sets were introduced, providing another boost in sales. Vacuum tubes RCA inherited American Marconi's status as a major producer of vacuum tubes, which were branded Radiotron in the United States. Especially after the rise of broadcasting, they were a major profit source for the company.
RCA's strong patent position meant that the company effectively set the selling prices for vacuum tubes in the U.S., which were significantly higher than in Europe, where Lee de Forest had allowed a key patent to lapse. The company began work on a secret project for the U.S. Navy called Madame X in September 1942. The Bloomington, Indiana, plant was one of the first of five RCA plants to produce Madame X vacuum tubes, which were components of the proximity fuse, a device that electronically detonated a shell's payload when it came within range of its target, as opposed to relying on a direct hit. James V. Forrestal, former Secretary of the Navy, said, "The proximity fuse had helped blaze the trail to Japan. Without the protection this ingenious device has given the surface ships of the fleet, our westward push could not have been so swift and the cost in men and ships would have been immeasurably greater." RCA was responsible for creating a series of innovative products, ranging from octal base metal tubes co-developed with General Electric before World War II, to greatly miniaturized Nuvistor tubes, used in the tuners of the New Vista series of television receivers. The Nuvistor, along with General Electric's Compactron, was among the last major vacuum tube innovations, and was meant to compete with the newly introduced transistor. By 1975, RCA had completely switched from tubes to solid-state devices in its television sets, except for the cathode-ray tube (CRT) picture tube itself. Phonographs and records The rapid rise of radio broadcasting during the early 1920s, which provided unlimited free entertainment in the home, had a detrimental effect on the American phonograph record industry. The Victor Talking Machine Company in Camden, New Jersey, was then the world's largest manufacturer of records and phonographs, including its popular "Victrola" line of consoles. In January 1929, RCA purchased the Victor Talking Machine Company; the acquired business became known as the RCA Victor Division of the Radio Corporation of America, and the purchase included ownership of Victor's Japanese subsidiary, the Victor Company of Japan (JVC), formed in 1927, and a controlling interest in The Gramophone Company Ltd. (later EMI Records) in England. RCA's acquisition of the Victor company included the rights to the iconic Nipper/"His Master's Voice" trademark across North America. RCA Victor popularized combined radio receiver-phonographs, and also created RCA Photophone, a movie sound-on-film system that competed with William Fox's sound-on-film Movietone and Warner Bros.' sound-on-disc Vitaphone. Although early announcements of the merger between RCA and Victor stressed that the two firms were linking on equal terms to form a new company, RCA initially had little interest in the phonograph record business. RCA's management was interested essentially in Victor's superior sales capabilities through the record company's large network of authorized distributors and dealers, as well as the extensive, efficient manufacturing facilities in Camden, New Jersey. Immediately following the purchase of Victor, RCA began planning the manufacture of radio sets and components on Victor's Camden assembly lines, while gradually decreasing the production of Victrolas and records. Following the stock market crash of 1929 and the subsequent Great Depression, the entire phonograph record industry in America nearly foundered.
During the nadir of the record business in the early 1930s, the manufacture of phonographs and records had all but ceased; extant older phonographs were now obsolete and most had been relegated to the attic or basement. In 1930, RCA Victor began selling the first all-electric Victrola, and in 1931, the company attempted to revitalize moribund record sales with the introduction of 33⅓ revolutions-per-minute (rpm) long-play records, which were a commercial failure during the Great Depression, partly because the Victrolas with the two-speed turntables required to play them were exorbitantly expensive, and also because the audio performance of the new records was generally poor; the new format used the same groove size as existing 78 rpm records, and it would require the smaller-radius stylus of the later microgroove systems to achieve acceptable slower-speed performance. Additionally, the new long-play records were pressed in a pliable, vinyl-based material called "Victrolac" which wore out rapidly under the heavy tonearms then in use. In 1934, following the debacle of its long-play record, RCA Victor introduced the Duo Jr., an inexpensive, small, basic electric turntable designed to be plugged into radio sets. The Duo Jr. was sold at cost, but was discounted even further with the purchase of a certain number of Victor records. The Duo Jr.'s rock-bottom price and America's slowly improving economy helped to overcome the national apathy to phonographs, and record sales gradually began to recover. Around 1935, RCA began marketing the modern RCA Victor M Special, a polished aluminum portable record player designed by John Vassos that has become an icon of 1930s American industrial design. In 1949, RCA Victor released the first 45 rpm "single" records, as a response to Columbia Records' successful introduction of its microgroove 33⅓ rpm "LP" format in 1948. RCA Victor adopted Columbia's 33⅓ rpm LP records in 1950, and Columbia in turn adopted RCA Victor's 45 rpm records. In 1965, RCA Victor launched the 8-track tape cartridge. For over a decade, sales of 8-tracks made a huge and profitable impact on consumers of recorded music. Sales of the 8-track format began to decline during the late 1970s as consumer preference turned to the compact cassette tape introduced by Philips in 1963. By 1982, 8-track tapes were no longer being stocked by most music retailers, and by 1990, the format had vanished completely. Motion pictures RCA also made investments in the movie industry, but they performed poorly. In April 1928, RCA Photophone, Inc., was organized by a group of companies including RCA to develop sound-movie technology. In the fall of 1927, RCA had purchased stock in Film Booking Office (FBO), and on October 25, 1928, with the help of Joseph P. Kennedy, the Radio-Keith-Orpheum Corporation (RKO) studio was formed by merging FBO with Keith-Albee-Orpheum Corporation (KAO), a company whose holdings included motion picture theaters. The theaters in which RKO had an interest provided a potential market for the RCA Photophone sound systems. RCA's ownership of RKO stock expanded from about one quarter in 1930 to about 61% in 1932. RKO encountered severe financial problems, going into receivership from early 1933 until 1940. RCA sold its holdings in the studio to raise funds for its basic operations. Separation from General Electric After years of industry complaints that the cross-licensing agreements between RCA, GE, and Westinghouse had in effect created illegal monopolies, the U.S.
Department of Justice brought antitrust charges against the three companies in May 1930. After much negotiation, in 1932 the Justice Department accepted a consent agreement that removed the restrictions established by the cross-licensing agreements, and also provided that RCA would become a fully independent company. As a result, GE and Westinghouse gave up their ownership interests in RCA, while RCA was allowed to keep its factories. To give RCA a chance to establish itself, GE and Westinghouse were required to refrain from competing in the radio business for the next two and one-half years. Television RCA began television development in early 1929, after an overly optimistic Vladimir K. Zworykin convinced Sarnoff that a commercial version of his prototype system could be produced in a relatively short time for $100,000. Following what would actually be many years of additional research and millions of dollars, RCA demonstrated an all-electronic black-and-white television system at the 1939 New York World's Fair. RCA began regular experimental television broadcasting from the NBC studios to the New York metropolitan area on April 30, 1939, via station W2XBS, channel 1 (which evolved into WNBC channel 4), from the new Empire State Building transmitter on top of the structure. Around this time, RCA began selling its first television set models, including the TRK-5 and TRK-9, in various New York stores. However, the FCC had not approved the start of commercial television operations, because technical standards had not yet been finalized. Concerned that RCA's broadcasts were an attempt to flood the market with sets that would force it to adopt RCA's current technology, the FCC stepped in to limit its broadcasts. Following the adoption of the National Television System Committee (NTSC) recommended standards, the FCC authorized the start of commercial television broadcasts on July 1, 1941. The entry of the United States into World War II a few months later greatly slowed its deployment, but RCA resumed selling television receivers almost immediately after the war ended in 1945. In 1950, the FCC adopted a standard for color television that had been promoted by CBS, but the effort soon failed, primarily because the color broadcasts could not be received by existing black-and-white sets. As the result of a major research push, RCA engineers developed a method of "compatible" color transmission that added the color information to the existing black-and-white signal, so that broadcasts could be picked up by both color sets and existing black-and-white sets. In 1953, RCA's all-electronic color television technology was adopted as the standard for the United States. At that time, Sarnoff predicted annual color television sales would reach 1.78 million in 1956, but the receivers were expensive and difficult to adjust, and there was initially a lack of color programming, so sales lagged badly and the actual 1956 total was only 120,000. RCA's ownership of NBC proved to be a major benefit, as that network was instructed to promote its color program offerings; even so, it was not until 1968 that color television sales in the United States surpassed those of black-and-white sets. While lauding the technical prowess of his RCA engineers who had developed color television, David Sarnoff, in marked contrast to William Paley, president of CBS, did not disguise his dislike for popular television programs.
His authorized biography even boasted that "no one has yet caught him in communion with one of the upper dozen or so top-rated programs" and "The popular programs, to put the matter bluntly, have very little appeal for him." RCA professional video cameras and studio gear, particularly the TK-40/41 series, became standard equipment at many American television network affiliates, as RCA's CT-100 ("RCA Merrill" to dealers) television sets introduced color television to the public. Diversification In 1941, shortly before the United States entered World War II, the cornerstone was laid for a research and development facility in Princeton, New Jersey, called RCA Laboratories. Led for many years by Elmer Engstrom, it was used to develop many innovations, including color television, the electron microscope, CMOS-based technology, heterojunction physics, optoelectronic emitting devices, liquid crystal displays (LCDs), videocassette recorders, direct broadcast television, direct broadcast satellite systems and high-definition television. RCA plants switched to war production shortly after the U.S. entered the war in December 1941. During World War II, RCA was involved in radar and radio development in support of the war effort, and ranked 43rd among United States corporations in the value of wartime military production contracts. One such contract was to outfit the battleship USS Texas with a 400-megahertz pulse radar set, using technology developed by RCA acoustics scientist Irving Wolff. During and after the war, RCA set up several new divisions for defense, space exploration and other activities. The RCA Service Corporation provided large numbers of staff for the Distant Early Warning (DEW) Line. RCA units won five Army–Navy "E" Awards for Excellence in production. Due to the hostilities between Japan and the United States during World War II, the Victor Company of Japan became an independent company after seceding from RCA Victor in the United States; JVC retained the 'Victor' and "His Master's Voice" trademarks for use in Japan only. In 1955, RCA sold its Estate brand of large appliance operations to Whirlpool Corporation. As part of this transaction, Whirlpool was given the right to market "RCA Whirlpool" appliances through the mid-1960s. RCA Graphic Systems Division (GSD) was an early supplier of electronics designed for the printing and publishing industries. It contracted with the German company Rudolf Hell to market adaptations of the Digiset photocomposition system as the Videocomp, and a Laser Color Scanner. The Videocomp was supported by a Spectra computer that ran the Page-1 and, later, the Page-II and FileComp composition systems. RCA later sold the Videocomp rights to Information International Inc. Computers RCA was one of a number of companies in the 1960s that entered the mainframe computer field to challenge the market leader International Business Machines (IBM). Although at this time computers were almost universally used for routine data processing and scientific research, in 1964 Sarnoff, who prided himself on being a visionary, predicted that "The computer will become the hub of a vast network of remote data stations and information banks feeding into the machine at a transmission rate of a billion or more bits of information a second ... Eventually, a global communications network handling voice, data and facsimile will instantly link man to machine—or machine to machine—by land, air, underwater, and space circuits.
[The computer] will affect man's ways of thinking, his means of education, his relationship to his physical and social environment, and it will alter his ways of living. ... [Before the end of this century, these forces] will coalesce into what unquestionably will become the greatest adventure of the human mind." RCA marketed a Spectra 70 computer line that was hardware, but not software, compatible with IBM's System/360 series. It also produced the RCA Series, which competed against the IBM System/370. This technology was leased to the English Electric company, which used it for its System 4 series, essentially RCA Spectra 70 clones. RCA's TSOS was the first demand-paging virtual memory operating system for mainframes on the market. By 1971, despite a significant investment, RCA had only a 4% market share, and it was estimated that it would cost around $500 million over the next five years to remain competitive with the IBM/370 series. On September 17, 1971, the RCA board of directors announced its decision to close its computer systems division (RCA-CSD), which would be written off as a $490 million company loss. Sperry Rand's UNIVAC division took over the RCA computer division in January 1972. Univac did not want the Spectra computers because they were similar to its own 9000 series; instead, it wanted RCA's computer customer base. Later years On January 1, 1965, Robert Sarnoff succeeded his father as RCA's president, although the elder Sarnoff remained in control as chairman of the board. The younger Sarnoff sought to modernize RCA's image with the introduction in late 1968 of what was then a futuristic-looking new logo (the letters 'RCA' in block, modernized form), replacing the original lightning bolt logo, and the virtual retirement of both the Victor and Nipper/"His Master's Voice" trademarks. The RCA Victor Division was renamed RCA Records; the 'Victor' and 'Victrola' trademarks were no longer used on RCA consumer electronics. 'Victor' was now restricted to the labels and album covers of RCA's regular popular record releases, while the Nipper/"His Master's Voice" trademark was seen only on the album covers of Red Seal records. In 1969, the company name was officially changed from Radio Corporation of America to the "RCA Corporation", to reflect its broader range of corporate activities and expansion into other countries. At the end of that same year, David Sarnoff, after being incapacitated by a long-term illness, was removed as the company's chairman of the board. He died in December 1971. RCA's exit from the mainframe computer market in 1971 marked a milestone in its transition from electronics and technology toward Robert Sarnoff's goal of diversifying RCA into a multinational business conglomerate. During the late 1960s and 1970s, the company made a wide-ranging series of acquisitions, including Hertz (rental cars), Banquet (frozen foods and TV dinners), Coronet (carpeting), Random House (publishing) and Gibson (greeting cards). However, the company was slipping into financial disarray, with wags calling it "Rugs Chickens & Automobiles" (RCA) to poke fun at its new direction. During this period, RCA continued to maintain its high standards of engineering excellence in broadcast and satellite communications equipment, but profits generated by the NBC television and radio networks began to decline. Robert Sarnoff's tenure as RCA president was unsuccessful, marked by falling profits.
While out of the country in October 1975, Sarnoff was ousted in a "boardroom coup" led by Anthony Conrad, who became RCA's new president. Conrad resigned less than a year later after he admitted failing to file income tax returns for six years. His successor, Edgar H. Griffiths, proved to be unpopular and retired in early 1981. Griffiths was succeeded by Thornton Bradshaw, who turned out to be the last RCA president. After the departure of Robert Sarnoff, Griffiths, who considered the demoted "His Master's Voice" trademark a "valuable company asset", restored Nipper as RCA's corporate mascot. On October 31, 1976, RCA formally announced the return of the Nipper trademark to RCA products and advertising. Earlier that year, RCA Records had begun to reinstate Nipper on most record labels in countries and territories where RCA held the rights to the trademark. Once again, Nipper was widely used in RCA newspaper, magazine, and TV advertisements. The trademark also returned to company stationery, sales literature, shipping cartons, store displays, delivery and service vehicles, and reappeared on RCA television sets and, in 1981, on the new CED videodisc system. Several newspaper articles and TV news reports about Nipper's revival appeared at the time. A multitude of new Nipper promotional items and collectibles also appeared, including T-shirts, caps, neckties, beach towels, cigarette lighters, coin banks, keychains, watches, clocks, coffee mugs, drinking glasses and stuffed toys. Projects attempting to establish new consumer electronics products during this era failed and lost RCA much money and prestige. The RCA Studio II home video game console, introduced in 1977, was canceled just under two years later due to poor sales. Development of RCA's Capacitance Electronic Disc (CED) videodisc system began in 1964, and after several years of delays it was launched in March 1981. Marketed under the SelectaVision name, the RCA CED videodisc system represented the largest investment RCA made in a single product, even greater than color TV. However, the system was practically obsolete by the time it finally appeared, and never reached the manufacturing volumes needed to bring its price down to a level that could compete against the newer, recordable and increasingly cheaper videotape technology. In April 1984, after three years of slow sales, RCA abandoned manufacture of the CED players, and ended videodisc production in 1986, after a loss of around $650 million. Around 1980, RCA's corporate strategy included moving the manufacture of its television receivers to Mexico. In 1981, Columbia Pictures sold a share in its home video division to RCA; outside North America this division was renamed "RCA/Columbia Pictures International Video" (now Sony Pictures Home Entertainment). The following year, within North America, it was renamed "RCA/Columbia Pictures Home Video". In 1983, the German media conglomerate Bertelsmann sold 50% of Arista Records to RCA Records; in 1985, RCA and Bertelsmann formed a joint venture, RCA/Ariola International, which took over management of RCA Records. Bertelsmann would fully acquire RCA Records from General Electric after GE absorbed RCA in 1986. RCA was still profitable in 1983, when it switched manufacturing of its VHS VCRs from Panasonic to Hitachi. In 1984, the RCA Broadcast Systems Division moved from the RCA Victor plant in Camden, New Jersey, to the site of the RCA antenna engineering facility in Gibbsboro, New Jersey.
On October 3, 1985, RCA announced it was closing the Broadcast Systems Division. In the years that followed, the broadcast product lines developed in Camden were terminated or sold off, and most of the old RCA Victor buildings and factories in Camden were demolished, except for a few of the original Victor buildings that had been declared national historic buildings. For several years, RCA spinoff L-3 Communications Systems East was headquartered in the famous Nipper Building, but it later moved to an adjacent building built for it by the city. The renovated Nipper Building now houses shops and luxury loft apartments. Also in 1985, RCA sold the Hertz car rental company to UAL, Inc. Re-acquisition and breakup by General Electric In December 1985, it was announced that General Electric would reacquire its former subsidiary for $6.28 billion in cash, or $66.50 per share of stock. GE's acquisition of RCA was the largest non-oil company merger in history up to that time and was completed on June 9, 1986. Despite initial assurances that the combined forces of GE and RCA would create a technology, manufacturing and entertainment behemoth, over the next few years GE proceeded to sell off most of RCA's assets. It was revealed that GE's main motivation for purchasing RCA was to acquire the NBC Television Network and the corporation's defense-related businesses. In 1987, GE disposed of its 50% interest in RCA Records to its German partner Bertelsmann, and RCA Records became a division of Bertelsmann Music Group. RCA Global Communications Inc., a division with roots dating back to RCA's founding in 1919, was sold to the MCI Communications Corporation; also in 1987, Westwood One purchased the NBC Radio Network. In 1988, the rights to manufacture consumer electronics products under the RCA and GE brands were acquired by Thomson Consumer Electronics, in exchange for some of Thomson's medical businesses. Also in 1988, the semiconductor business (including the former RCA Solid State unit and Intersil) was bought by Harris Corporation. That same year, the iconic RCA Building, known as "30 Rock", at Rockefeller Center was renamed the GE Building. In 1991, GE sold its share in RCA/Columbia to Sony Pictures, which renamed the unit "Columbia TriStar Home Video" (later further renamed Columbia TriStar Home Entertainment, now Sony Pictures Home Entertainment). Sarnoff Labs was put on a five-year plan whereby GE would fund all the labs' activities for the first year, then reduce its support to near zero after the fifth year. This required Sarnoff Labs to change its business model to become an industrial contract research facility. In 1988, it was transferred to SRI International (SRI) as the David Sarnoff Research Center, and subsequently renamed the Sarnoff Corporation. In January 2011, Sarnoff Corporation was fully integrated into SRI. In 2011, GE sold its controlling interest in NBC, by this time part of the multimedia NBCUniversal venture that included TV and cable, to Comcast, and in 2013, Comcast acquired the remaining interest. After the sale of NBCUniversal, the only former RCA unit that GE retained was Government Services. In 2022, Thomson's successor company, Technicolor SA, sold the RCA trademarks to licensing firm Talisman Brands, Inc., d/b/a Established Incorporated (stylized as "established").
Legacy RCA antique radios, and early color television receivers such as the RCA Merrill/CT-100, are among the more sought-after collectible radios and televisions, due to their popularity during the golden age of radio and the historic significance of the RCA name, as well as their styling, manufacturing quality and engineering innovations. Most collectible are the pre-war television sets manufactured by RCA beginning in 1939, including the TRK-5, TRK-9 and TRK-12 models. The RCA Heritage Museum was established at Rowan University in 2012. The historic RCA Victor Building 17, the "Nipper Building", in Camden, New Jersey, was converted to luxury apartments in 2003. A type of plug/jack combination used in audio and video cables is still called the RCA connector. To this day, a variety of consumer electronics including 2-in-1 tablets, televisions and telephones, home appliances and more are sold under the RCA brand name. RCA Records continues as a flagship label of Sony Music Entertainment. Numerous former RCA manufacturing sites have been reported to be polluted with industrial waste.
========================================
[SOURCE: https://en.wikipedia.org/wiki/PeopleCode] | [TOKENS: 282]
PeopleCode PeopleCode is a proprietary object-oriented programming language used to express business logic for PeopleSoft applications. Syntactically, PeopleCode is similar to other programming languages, and can be found in both loosely-typed and strongly-typed forms. PeopleCode and its run-time environment are part of the larger PeopleTools framework. PeopleCode has evolved over time, and its implementation across PeopleSoft applications lacks consistency. PeopleCode offers some interoperability with the Java programming language. Definition name references, for example, enable you to refer to PeopleTools definitions, such as record definitions or pages, without using hard-coded string literals. Other language features, such as PeopleCode data types and metastrings, reflect the close interaction of PeopleTools and Structured Query Language (SQL). Dot notation, classes and methods in PeopleCode are similar to those of other object-oriented languages, like Java. Object syntax was an important feature of PeopleTools 8. Language features PeopleCode supports several types of functions, including built-in functions, internal functions, external PeopleCode functions, and external non-PeopleCode functions. In addition, PeopleCode supports methods; the main difference between a built-in function and a method is that a method must be invoked on an object instance using dot notation, whereas a built-in function stands alone. When working with SQL definitions, the values for the bind variables can be omitted and supplied later. For Insert, Update, or Delete commands these values would be supplied using the Execute method. (If all the necessary input values are supplied, the SQL is executed immediately.)
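The deferred-bind behavior described above can be illustrated with a short sketch. This is a minimal example rather than code from the article: the record name MY_RECORD and the field EMPLID are hypothetical stand-ins, while CreateSQL, CreateRecord, the %Insert meta-SQL construct, and the SQL class's Execute method are standard PeopleTools constructs.

   Local SQL &sql;
   Local Record &rec;

   /* Build a record object to serve as the bind value (hypothetical record definition) */
   &rec = CreateRecord(Record.MY_RECORD);
   &rec.EMPLID.Value = "KU0001";

   /* No bind value is supplied here, so the Insert is not executed immediately */
   &sql = CreateSQL("%Insert(:1)");

   /* The bind value is supplied later through the Execute method */
   &sql.Execute(&rec);

Had the record been passed directly, as in CreateSQL("%Insert(:1)", &rec), all the necessary input values would have been present and the SQL would have executed immediately.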
========================================
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_note-17] | [TOKENS: 8626]
Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger, Threads and Manus. The company also operates an advertising network for its own sites and third parties; as of 2023, advertising accounted for 97.8 percent of its total revenue. Meta has been described as a part of Big Tech, a term for the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse—an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters priced the shares at $38 each, valuing the company at $104 billion, the largest valuation to that date for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand. The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations—surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods—and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline.
On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. Facebook was added to the S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced a Microsoft Windows client for its gaming service Facebook Gameroom, formerly Facebook Games Arcade, at the Unity Technologies developers conference. The client allows Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook, similar to TikTok, that was launched on iOS and Android in 2018 and was aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would be shutting down on July 10. In 2018, Oculus lead Jason Rubin sent his 50-page vision document titled "The Metaverse" to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He also urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" to prevent its competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop its own stablecoin cryptocurrency. Later, it was reported that Libra was being supported by financial companies such as Visa, Mastercard, PayPal and Uber. The consortium of companies was expected to pool $10 million each to fund the launch of the cryptocurrency coin named Libra. Depending on when it would receive approval from the Swiss Financial Market Supervisory Authority to operate as a payments service, the Libra Association had planned to launch a limited-format cryptocurrency in 2021. Libra was renamed Diem, before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021, about Facebook's plan to rebrand the company and change its name.
In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the company's pivot toward building the metaverse – without mentioning the rebranding and the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms were introduced at Facebook Connect on October 28, 2021. According to Facebook's PR campaign, the name change reflected the company's shifting long-term focus on building the metaverse, a digital extension of the physical world through social media, virtual reality and augmented reality features. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services, by a Canadian company that provided big data analysis of scientific literature. This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project and would therefore transfer its rights to the name to Meta Platforms; the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users, and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertising revenue, an amount equal to roughly 8% of its revenue for 2021. In a meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% reduction in the company's share price that occurred in reaction to the news eliminated some $230 billion of value from Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion. Zuckerberg owns 13% of Meta, and the holding makes up the bulk of his wealth. According to reports published by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Meta's Facebook and Instagram (though not Meta-owned WhatsApp) were banned in Russia and added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech (including alleged calls for genocide) amid the ongoing Russian invasion of Ukraine. Meta appealed against the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a series of smartglasses that could play music and take pictures.
Meta and Luxottica parent company EssilorLuxottica declined to disclose sales figures for the product line as of September 2022, though Meta expressed satisfaction with customer feedback. In July 2022, Meta saw its first year-on-year revenue decline when its total revenue slipped by 1% to $28.8 billion. Analysts and journalists attributed the loss to its advertising business, which had been limited by Apple's App Tracking Transparency feature and the number of people who had opted not to be tracked by Meta apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite reaching the top 5 in the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn and "ads signal loss". Plans to lay off a further 10,000 employees began in April 2023. The layoffs were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter and Lyft. Starting in 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive's aim of making its data available to social science research. In 2023, Ireland's Data Protection Commissioner imposed a record €1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens. In March 2023, Meta announced a new round of layoffs that would cut 10,000 employees and close 5,000 open positions to make the company more efficient. Meta's revenue surpassed analyst expectations for the first quarter of 2023 after the company announced that it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers like Microsoft. It was the first project to be unveiled from Meta's generative AI group after the group was set up in February. Meta would not charge for access or usage, instead operating with an open-source model to allow Meta to ascertain what improvements needed to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use. An earlier version of Llama had been released to academics. In August 2023, Meta announced the permanent removal of news content from Facebook and Instagram in Canada due to the Online News Act, which requires Canadian news outlets to be compensated for content shared on Meta's platforms. The Online News Act was in effect by year-end, but Meta would not participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the best-performing technology stocks of the year, with its share price up 150 percent.
Its stock reached an all-time high in January 2024, bringing Meta within 2% of a $1 trillion market capitalization. In November 2023, Meta launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscriber model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA for the alleged use of its social media platforms to sell illegal drugs. On 16 May 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan encountered a troubling issue when Instagram removed his posts, citing copyright violations even though his content was original and free from copyrighted material. He discovered that extortionists were behind these takedowns, offering to restore his content for $3,000 or provide ongoing protection for $1,000 per month. This scam, exploiting Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures like Ammar al-Hakim. This situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On 16 September 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity." This decision followed allegations that RT and its employees funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive. Instead, the company pivoted to producing a small number of the glasses to be used internally. On 4 October 2024, Meta announced its new AI model called Movie Gen, capable of generating realistic video and audio clips based on user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products by the following year. The model was built using a combination of licensed and publicly available datasets. On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue.
In November 2024, TechCrunch reported that Meta was considering building a $10 billion global underwater cable spanning 25,000 miles. In the same month, Meta closed down 2 million accounts on Facebook and Instagram that were linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates running "pig butchering" scams. In December 2024, Meta announced that, beginning February 2025, it would require advertisers running financial services ads in Australia to verify information about the beneficiary and payer of those ads, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On the 11th of that month, Meta experienced a global outage, impacting accounts on all of its social media and messaging applications. Outage reports from DownDetector reached 70,000+ and 100,000+ within minutes for Instagram and Facebook, respectively. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception allowing users to call LGBTQ people mentally ill on the basis of being gay or transgender. Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion in value, which would make it one of the largest private company funding events of all time. In October 2025, it was announced that Meta would lay off 600 employees in its artificial intelligence unit in an effort to streamline its operations. The company described its AI unit as "bloated" and sought to trim down the department. The layoffs were to impact Meta's AI infrastructure units, its Fundamental Artificial Intelligence Research (FAIR) unit, and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions was in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy mobile messaging company WhatsApp for US$19 billion in cash and stock; the acquisition was completed on October 6 of that year. Later that year, Facebook bought Oculus VR, which released its first consumer virtual reality headset in 2016, for $2.3 billion in cash and stock. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber.
In late 2022, after Facebook, Inc. rebranded to Meta Platforms, Inc., Oculus was rebranded as Meta Quest. In May 2020, Facebook, Inc. announced it had acquired Giphy for a reported cash price of $400 million, to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controlled half of the advertising in the UK. Meta agreed to the sale, though it stated that it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot specialist startup Kustomer to encourage companies to use its platform for business. It was reported that Kustomer was valued at slightly over $1 billion. The deal was closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced that Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had achieved $100 million in recurring revenue just eight months after its launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators. The examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists. In 2019, it had spent $16.7 million on lobbying and had a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest spender of lobbying money among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, who was hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund for then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Jim Jordan indicating that during the COVID-19 pandemic the Biden administration had repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, formerly an employee at the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government an undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers.
Following Donald Trump's return to the presidency in 2025, various sources noted possible censorship related to the Democratic Party on Instagram and other Meta platforms. In February 2025, Meta flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed sexual harassment in the workplace by senior executives. The New York Times reported that the arbitration was among Meta's most forceful attempts to suppress a former employee's account of workplace dynamics. Publisher Macmillan reacted to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of 15 March 2025, hardback and digital versions of Careless People were being offered for sale by major online retailers. From October 2025, Meta began removing and restricting access to accounts related to LGBTQ, reproductive health, and abortion information pages on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "one of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of being a host for fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began to take steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives, which involved partnering with third-party fact-checkers and publicly flagging fake news, were regularly ineffective and appeared to be having minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue to disseminate a falsified video of US president Joe Biden, even after it had been proven to be fake, attracted criticism and concern. In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform was excessive, the decision received criticism from fact-checking institutions, which stated that the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening its policies on hate speech that were designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, as well as explicitly allowing users to accuse LGBT people of being mentally ill or abnormal based on their sexual orientation or gender identity. 
In January 2025, Meta faced significant criticism for its role in removing LGBTQ+ content from its platforms, amid its broader efforts to address anti-LGBTQ+ hate speech. The removal of LGBTQ+ themes was noted as part of the wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that some critics argued were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that were historically important for outreach and support. Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc., and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for significant and persistent breaches of privacy law in connection with the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical liability of up to $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission and 46 states (excluding Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia and the territory of Guam launched Federal Trade Commission v. Facebook, an antitrust lawsuit against Facebook. The lawsuit concerns Facebook's acquisition of two competitors, Instagram and WhatsApp, and the ensuing monopolistic situation. The FTC alleged that Facebook held monopoly power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to construct a counterfactual of an internet in which the Facebook-WhatsApp-Instagram entity did not exist, and to prove that this had harmed competition or consumers. In November 2025, it was ruled that Meta did not violate antitrust laws and held no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging poor working conditions in Kenya for workers moderating Facebook posts. According to the lawsuit, 260 screeners were declared redundant with unclear reasoning. The lawsuit seeks financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram has led to attempted or actual suicides, eating disorders and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility. 
The company noted that tools had been developed for parents to keep track of their children's activity on Instagram and set time limits, in addition to Meta's "Take a break" reminders. In addition, the company said it was providing resources specific to eating disorders and developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, which was filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics. The U.S. Department of Justice stated that this was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the discriminatory advertising tool. In January 2023, Meta was fined €390 million for violations of the European Union General Data Protection Regulation. In May 2023, the European Data Protection Board fined Meta a record €1.2 billion for breaching European Union data privacy laws by transferring personal data of Facebook users to servers in the U.S. In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements. The plaintiffs sought approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit in which three plaintiffs claim that Facebook inflamed civil violence in Ethiopia in 2021 could proceed. In April 2025, Meta was fined €200 million ($230 million) for breaking the Digital Markets Act by imposing a "consent or pay" system that forces users to either allow their personal data to be used to target advertisements, or pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence and child sexual abuse. Meta moved the moderation service to the Ghanaian capital of Accra after legal issues in Kenya, the previous location. The new moderation company is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest that conditions there are worse than at the previous Kenyan location, with many workers afraid to speak out for fear of being returned to conflict zones. Workers reported developing mental illnesses, attempting suicide, and receiving low pay. On January 26, 2026, a case was filed in a New Mexico state court alleging that Mark Zuckerberg approved allowing minors to access artificial intelligence chatbot companions that safety staffers warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of digital armies, filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties. 
Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure Meta's key management consists of: As of October 2022, Meta had 83,553 employees worldwide. As of June 2024, Meta's board consisted of the following directors; Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares. Insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg and Christopher K. Cox. The largest shareholders in late 2024/early 2025 were: Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it is the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes. European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 in the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York magazine, since its rebranding, Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google, which prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses. An active advertiser was defined as an entity that had advertised on the Facebook platform in the last 28 days. In March 2016, Facebook announced it had reached three million active advertisers, with more than 70% from outside the United States. Prices for advertising follow a variable pricing model based on auctioning ad placements and the potential engagement levels of the advertisement itself. Similar to other online advertising platforms like Google and Twitter, targeting of advertisements is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta's platforms is employed through two methods based on the viewing habits, likes and shares, and purchasing data of the audience: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing, and Meta faces a potential fine of $3–5 billion. The U.S. Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations. 
Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e. Irish profits). On the basis that Meta Platforms Ireland Limited is paying some tax, the effective minimum US tax for Facebook Ireland will be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. Irish at the GILTI rate) and accelerated capital expensing would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5 billion non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta makes use of the Double Irish arrangement, which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts, in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who do not have access to the Internet. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai. In September 2023, it was reported that Meta had paid £149 million to British Land to break the lease on its Triton Square office in London. Meta reportedly had another 18 years left on its lease on the site. As of 2023, Facebook operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Ex-Facebook employee Frances Haugen, the whistleblower behind the Facebook Papers, responded to the rebranding efforts by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use-case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well. 
I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes. Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as 7 years old on the platform, despite Oculus headsets being intended for users over 13.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Conversion_to_Judaism] | [TOKENS: 8716]
Contents Conversion to Judaism Conversion to Judaism (Hebrew: גִּיּוּר, romanized: giyur or Hebrew: גֵּרוּת, romanized: gerut) is the process by which non-Jews adopt the Jewish religion and become members of the Jewish ethnoreligious community. It thus resembles both conversion to other religions and naturalization. The procedure and requirements for conversion depend on the sponsoring denomination. Furthermore, a conversion done in accordance with one Jewish denomination is not a guarantee of recognition by another denomination. Normally, though not always, the conversions performed by more stringent denominations are recognized by less stringent ones, but not the other way around. A formal conversion is also sometimes undertaken by individuals who are raised Jewish or have Jewish ancestry but who may not be considered Jewish according to stringent interpretations of traditional Jewish law. There are some groups that have adopted Jewish customs and practices. For example, in Russia the Subbotniks have adopted most aspects of Judaism without formal conversion to Judaism. However, if Subbotniks, or anyone without a formal conversion, wish to marry into a traditional Jewish community or immigrate to Israel under the Law of Return, they must have a formal conversion. Terminology The word ger comes from the Semitic root ג־ו־ר, which connotes living abroad. In the Hebrew Bible, a ger is a "foreigner" or "sojourner"; the latter is a foreigner who has settled inside Judah. Marc Angel writes: "The Hebrew ger (in post-Biblical times translated as 'proselyte') literally means 'resident' and refers to a non-Israelite who lived among the Israelite community. When the Torah commands compassion and equal justice for the ger, it is referring to these 'residents'. Rabbinic tradition interpreted the word ger as referring to proselytes." Angel's explanation of the literal meaning of ger as a resident alien is borne out in biblical verses such as Lev 19:34: The stranger that sojourneth with you shall be unto you as the home-born among you, and thou shalt love him as thyself; for ye were strangers in the land of Egypt: I am the LORD your God. Another verse which has been interpreted as referring to non-Jews converting to Judaism is Esther 8:17, although no process is described. The word ger in Numbers 15 is rendered as prosílytos (Koine Greek: προσήλυτος) in the Septuagint and gəyurā (Jewish Palestinian Aramaic: גיורא) in Targum Onkelos on Numbers 15:15–16, both of which denote a convert to Judaism. A formal male convert to Judaism is referred to as a ger; the term for a woman convert is giyoret (גִּיוֹרֶת). In Rabbinic Judaism, a ger or giyoret is considered fully Jewish. In Karaite Judaism the term ger only refers to a non-Jew who has yet to convert; once converted, the person is no longer called a ger. In the Talmud, ger is used in two senses: ger tzedeq (גֵּר־צֶדֶק) "righteous convert" is a proselyte to Judaism, while a ger toshav "settled foreigner" is a Gentile inhabitant of the Land of Israel who observes the Seven Laws of Noah and has repudiated all links with idolatry. In Modern Hebrew, the unqualified term ger refers to a ger tzedeq. Overview According to Maimonides (Isurei Biah 13:14), converts have been accepted since the beginning of Jewish history, and the foreign wives of Jewish leaders—such as Samson and Solomon—were converts. 
Yet he says (Isurei Biah 13:15) that in the times of Jewish political power, such as the days of Kings David (hypothesized to have been during the 11th or 10th centuries BCE) and Solomon (mid-10th century BCE), batei din (Jewish courts) did not accept converts who may not have had the right intention, and such converts had to wait and prove their intentions to be legally accepted. With the notable exception of some Syrian Jewish communities (primarily the Brooklyn, New York, and Deal, New Jersey, communities), all mainstream forms of Judaism today are open to sincere converts, with each denomination accepting converts who converted under its own auspices. The rules vary between denominations, as does the acceptance of converts from one denomination by another. For Rabbinic Judaism, the laws governing conversion (gerut) are based on codes of law and texts, including discussions in the Talmud, through the Shulhan Arukh and subsequent interpretations. (Many of the guidelines for accepting converts are based on the Book of Ruth and the manner whereby Ruth was brought into the fold through her mother-in-law, Naomi.) These rules are held as authoritative by Orthodox Judaism and Conservative Judaism. In Judaism, proselytizing is discouraged, and conversion itself is somewhat discouraged. An ancient tradition called for a sponsoring rabbi to discourage potential converts three times. If the potential convert remained adamant in their desire to convert, the rabbi would then begin the process. This practice does not have any solid basis in the written text, and while it may have been the practice in some locations, it was not universal. The tradition is uncommon in modern practice. To convert, the candidate must undergo circumcision if male, and immerse in the mikveh before a beth din comprising three Jewish men who are shomer Shabbat. There is also a requirement to accept the mitzvot (although not necessarily a commitment to keep them). Today, the process has become more centralized, with the conversion candidate having to convince a rabbi and the beth din of their sincerity, and there will usually be a considerable amount of study. In addition to studying, potential converts are typically expected to become involved in the Jewish community. This includes attending services, participating in holidays and rituals, and building relationships with other Jews in the community. They will then be tested and formally accepted, and the convert is issued a Shtar geirut ("Certificate of Conversion"). As the conversion process becomes more centralized in Israel, there are only a limited number of permanent conversion courts that are acceptable to the Chief Rabbinate of Israel. However, some rabbis are willing to conduct decentralized conversions, which they recognize among themselves. Two of the more prominent of these rabbis are Chuck Davidson and Haim Amsalem. Conservative halakha takes a more lenient approach than Modern Orthodox Judaism. Its approach to the validity of conversions is based on whether the conversion procedure followed Rabbinic norms rather than on the reliability of those performing it or the nature of the obligations the convert undertook. Accordingly, it may accept the validity of some Reform and Reconstructionist conversions, but only if they include immersion in a mikveh before a rabbinical court (beit din) and, for men, circumcision, or a symbolic circumcision for those already circumcised (hatafat dam brit). The requirements of Reform Judaism for conversions are markedly different and far more lenient. 
The denomination states that "people considering conversion are expected to study Jewish theology, rituals, history, culture, and customs, and to begin incorporating Jewish practices into their lives. The length and format of the course of study will vary from rabbi to rabbi and community to community. However, most now require a course in basic Judaism and individual study with a rabbi, as well as attendance at services and participation in home practice and synagogue life." Although an infant conversion might be accepted in some circumstances, such as in the case of adopted children or children whose parents convert, children who convert would typically be asked if they want to remain Jewish after reaching religious adulthood, which is 12 years of age for a girl and 13 for a boy. This standard is applied by Orthodox and Conservative Judaism, which accept halakha as binding and normative. Reconstructionist Judaism values the symbolism of the conversion ritual and encourages those who were not born of Jewish parents and who wish to convert to undergo this rite of passage. The Reconstructionist course of study for a prospective convert, determined by the rabbi and congregation with whom the individual is working, includes history, observance, beliefs, and learning to make educated choices. The completion of the process is marked by ritual immersion for men and women; circumcision or hatafat dam brit (a symbolic drop of blood) for men (unless there exists an extraordinary physical or emotional hazard); a valid beth din (in Reconstructionist conversions, dialogue with three "knowledgeable Jews", at least one of whom is a rabbi); and often a public welcoming ceremony. Karaite Judaism does not accept the Talmud and, therefore, has different requirements for conversion. Traditionally non-proselytizing, Karaite Judaism's long-standing abstention from conversions was recently lifted. On 1 August 2007, the Karaites reportedly converted their first new members in 500 years. At a ceremony in their Northern California synagogue, ten adults and four minors swore fealty to Judaism after completing a year of study. This conversion came 15 years after the Karaite Council of Sages reversed its centuries-old ban on accepting converts. Humanistic Judaism postulates that "conversion" does not suit the process of becoming a Jew, as it implies a change in belief, which, unlike a change in behavior, cannot simply be chosen. The shift is better described as naturalization, affiliation, or adoption, reflecting alterations in family ties and cultural aspirations rather than fundamental belief changes. In ancient times In antiquity, conversion to Judaism appears to have been a voluntary and individual process, rather than the result of organized missionary efforts. While some non-Jews did convert—both men and women—because they found Judaism or elements of it appealing, no unambiguous evidence suggests that Jews actively sought to convert others. The question of Jewish missionary activity remains debated, but existing evidence does not support the notion that Jews deliberately approached non-Jews with the goal of turning them into Jews. 
Louis Feldman's views on active Jewish missionizing have changed. While viewing classical Judaism as being receptive to converts, especially from the second century BC through the first century AD, he points to a lack of either missionizing tracts or records of the names of rabbis who sought converts as evidence for the lack of active Jewish missionizing. Feldman maintains that conversion to Judaism was common and the Jewish population was large both within the Land of Israel and in the Diaspora. According to Lester L. Grabbe, although there are "various references to proselytizing in the literature from the Greek and Roman periods", "it seems unlikely that there was a major 'mission' on the part of the Jews to gain Gentile converts." Other historians believe that conversion during the Roman era was limited in number and did not account for much of the Jewish population growth, due to various factors such as the illegality of male conversion to Judaism in the Roman world from the mid-second century. Another factor that made conversion difficult in the Roman world was the halakhic requirement of circumcision, a requirement that proselytizing Christianity quickly dropped. The Fiscus Judaicus, a tax imposed on Jews in 70 AD and relaxed to exclude Christians in 96 AD, also limited Judaism's appeal. According to The Jewish Encyclopedia article on circumcision, in the first century AD, before the Mishnah was edited, the requirement for circumcision of proselytes was an open issue between the Zealots and liberal parties in ancient Israel. Joshua ben Hananiah argued that besides accepting Jewish beliefs and laws, a prospective convert to Judaism must undergo immersion in a mikveh. In contrast, Eliezer ben Hurcanus made circumcision a condition for conversion. A similar controversy between the Shammaites and the Hillelites is recorded regarding a proselyte born without a foreskin: the former demanded the spilling of a drop of blood symbolic of the Brit Milah, thereby entering into the covenant; the latter declared it to be unnecessary. In discussions about the necessity of circumcision for those born of a Jewish mother, lending some support to the need for circumcision of converts, the Midrash states: "If thy sons accept My Godhead [by undergoing circumcision] I shall be their God and bring them into the land; but if they do not observe My covenant in regard either to circumcision or to the Sabbath, they shall not enter the land of promise" (Midrash Genesis Rabbah xlvi). "The Sabbath-keepers who are not circumcised are intruders, and deserve punishment" (Midrash Deut. Rabbah i). However, the opposing view is supported in the Babylonian Talmud by Joshua ben Hananiah: "A male convert who has been immersed but not circumcised, or circumcised but not immersed, is a convert." This view was later rejected by the Talmud. Josephus, in Antiquities of the Jews, Book 20, Chapter 2, recorded the story of King Izates of Adiabene, who decided to follow the Law of Moses on the advice of a Jewish merchant named Ananias. He was going to get circumcised, but his mother, Helen, who herself embraced the Jewish customs, advised against it on the grounds that the subjects would not stand to be ruled by someone who followed such "strange and foreign rites". 
Ananias likewise advised against it, on the grounds that worship of God was superior to circumcision (Robert Eisenman, in James the Brother of Jesus, claims that Ananias is Paul the Apostle, who held similar views, although this is a novel interpretation lacking support in mainstream scholarship) and that God would forgive him for fear of his subjects. So Izates decided against it. However, later, "a certain other Jew that came out of Galilee, whose name was Eleazar," who was well versed in the Law, convinced him that he should, on the grounds that it was one thing to read the Law and another thing to practice it, and so he did. Once Helen and Ananias found out, they were struck by great fear of the possible consequences, but as Josephus put it, God looked after Izates. As his reign was peaceful and blessed, Helen visited the Second Temple to thank God, and since there was a terrible famine at the time, she brought much food and aid to the people of Jerusalem. Requirements The Amoraim who produced the Talmud set out basic requirements for conversion to Judaism (Keritot 8b), which must be witnessed and affirmed by a beth din (a rabbinical court composed of three Jewish males above the age of Bar Mitzvah). The judges on the Beth Din should be observant of Jewish law. Common Orthodox practice is for all of the judges to be rabbis or Orthodox clergy. Today, conversion requirements and the time required to complete conversion differ according to denomination and rabbinic sponsor. The basic requirements set out in the Talmud include: The consensus of halakhic authorities also requires a convert to understand and formally accept the duties of classical Jewish law. This is not stated explicitly in the Talmud, but was inferred by subsequent commentators. After confirming that all these requirements have been met, the beth din issues a "Certificate of Conversion" (Shtar Giur), certifying that the person is now a Jew. Modern practice The requirements for conversions vary somewhat within the different branches of Judaism, so whether or not a conversion is recognized by another denomination is often an issue fraught with religious politics. The Orthodox rejection of non-Orthodox conversions derives less from qualms with the conversion process itself, since Conservative and even some Reform conversions are very similar to Orthodox conversions with respect to duration and content, than from the Orthodox presumption that the convert was not instructed in Jewish law to Orthodox standards. The conflicting interpretations of whether non-Orthodox conversions are considered valid also have implications for converts aiming to acquire Israeli citizenship, as personal status in Israel is heavily influenced by the decisions of the Great Rabbinical Court in Israel, which rejects non-Orthodox conversions. Furthermore, scholars such as Hacker have argued that modern conversions are significantly influenced by the gender of the convert. This is due to the jurisdiction of the Great Rabbinical Court in Israel over the personal status of Israeli citizens: the court does not recognize interfaith marriages, nor does it recognize children of Jewish fathers as Jewish if the mother is not Jewish or has not converted to Orthodox Judaism. In general, immersion in the mikveh is an important part of a traditional conversion. If the person who is converting is male, circumcision is a part of the traditional conversion process as well. 
If the male who is converting has already been circumcised, then a ritual removal of a single drop of blood will take place (hatafat dam brit). However, more liberal branches of Judaism have more relaxed requirements for immersion and circumcision. Maturity Someone who converts as a minor (younger than 12 for a girl and 13 for a boy) is required to fulfill the requirements of conversion, that is, circumcision and mikvah, but is not required to perform an "acceptance of the mitzvoth". The conversion instead is done al daat beth din, i.e. the acceptance is done by the Beth Din presiding over the conversion. The child lives as a Jew until their bar/bat mitzvah, at which point they have the option of rejecting their conversion. Once they have chosen to continue as a Jew, the conversion can no longer be rejected. Reform Jewish views In the United States, Reform Judaism rejects the concept that any rules or rituals should be considered necessary for conversion to Judaism. In the late 19th century, the Central Conference of American Rabbis, the official body of American Reform rabbis, formally resolved to permit the admission of converts "without any initiatory rite, ceremony, or observance whatsoever." (CCAR Yearbook 3 (1893), 73–95; American Reform Responsa (ARR), no. 68, at 236–237.) Although this resolution has often been examined critically by many Reform rabbis, it remains the official policy of American Reform Judaism (CCAR Responsa "Circumcision for an Eight-Year-Old Convert" 5756.13 and Solomon Freehof, Reform Responsa for Our Time, no. 15.) Thus, American Reform Judaism does not require ritual immersion in a mikveh, circumcision, or acceptance of mitzvot as normative. Appearance before a Beth Din is recommended, but is not considered necessary. Converts are asked to commit to religious standards set by the local Reform community. In actual practice, the requirements for conversion of any individual are determined by the rabbi who sponsors the convert. Typically, Reform rabbis require prospective converts to take a course of study in Judaism, such as an "Introduction to Judaism" course, to participate in worship at a synagogue, and to live as a Jew (however that is interpreted by the individual rabbi) for a period of time. A period of one year is common, although individual rabbis' requirements vary. When the sponsoring rabbi feels that the candidate is ready, a Beth Din may be convened. Other rituals, such as immersion in a mikvah, circumcision (or hatafat dam brit), and a public ceremony to celebrate the conversion, are also at the discretion of the rabbi. Interdenominational views In response to the tremendous variations that exist within the Reform community, the Conservative Jewish movement has attempted to set out a nuanced approach. The Conservative Committee on Jewish Law and Standards has issued a legal opinion stating that Reform conversions may be accepted as valid only when they include the minimal Conservative halachic requirements of milah and t'vilah, appearance before a Conservative Beth Din, and a course of Conservative study. (Proceedings of Committee on Jewish Law and Standards: 1980–1985, pp. 77–101.) In general, branches of Orthodox Judaism consider non-Orthodox conversions either inadequate or of questionable halachic compliance, and such conversions are therefore not accepted by these branches of Judaism. Conversely, both Conservative and Reform Judaism accept the Orthodox conversion process as being valid. 
Since 2008, Haredi Orthodox religious courts in Israel have been rejecting conversions performed by a number of Orthodox rabbis, as the Chief Rabbinate does not accept the authority of the presiding rabbis. Intra-Orthodox controversy In 2008, a Haredi-dominated Badatz in Israel annulled thousands of conversions performed by the Military Rabbinate in Israel. The Chief Rabbinate of Israel, which is the only state-recognized authority on religious matters, backed by Rabbi Ovadia Yosef, ruled against this, making the annulment legally invalid for purposes of Israeli law. Karaite views As of 2006, the Moetzet Hakhamim (Council of Sages) began to accept converts to Karaite Judaism through the Karaite Jewish University. The process requires one year of learning, circumcision (for males), and the taking of the vow that Ruth took: "For whither thou goest, I will go; and where thou lodgest, I will lodge; thy people shall be my people, and thy God my God; where thou diest, will I die, and there will I be buried; the LORD do so to me, and more also, if aught but death part thee and me." Ruth 1:16–17 Attempts to resolve the "Who is a Jew?" issue In the 1950s, Rabbi Joseph Soloveitchik and other members of the Rabbinical Council of America engaged in a series of private negotiations with the leaders of Conservative Judaism's Rabbinical Assembly, including Saul Lieberman; their goal was to create a joint Orthodox-Conservative national beth din for all Jews in the United States that would create communal standards of marriage and divorce. It was to be modeled after the Israeli Chief Rabbinate: all of its judges would have been Orthodox, while the larger Conservative movement would have accepted it as legitimate. Conservative rabbis in the Rabbinical Assembly created a Joint Conference on Jewish Law, devoting a year to this effort. For a number of reasons, the project did not succeed. According to Orthodox Rabbi Louis Bernstein, the major reason for its failure was the Orthodox rabbis' insistence that the Conservative Rabbinical Assembly agree to expel Conservative rabbis for actions they took prior to the formation of the new beth din, which the RA refused to do. According to Orthodox Rabbi Emanuel Rackman, former president of the RCA, the major reason for its failure was pressure from haredi Orthodox rabbis, who held that any cooperation between Orthodoxy and Conservatism was forbidden. In 1956, Rabbi Harry Halpern of the Joint Conference wrote a report on the demise of this beth din. He writes that negotiations between the Orthodox and Conservative denominations were completed and agreed upon, but then a new requirement was demanded by the RCA: that the RA "impose severe sanctions" upon Conservative rabbis for actions they took before this new beth din was formed. Halpern writes that the RA "could not assent to rigorously disciplining our members at the behest of an outside group." He goes on to write that although subsequent efforts were made to cooperate with the Orthodox, a letter from eleven Rosh Yeshivas was circulated declaring that Orthodox rabbis are forbidden to cooperate with Conservative rabbis. In Denver, Colorado, a joint Orthodox, Traditional, Conservative and Reform Bet Din was formed to promote uniform standards for conversion to Judaism. A number of its rabbis were Orthodox and had semicha from Orthodox yeshivas, but served in synagogues without a mechitza; such synagogues were identified with Traditional Judaism. 
Over a five-year period, they performed some 750 conversions to Judaism. However, in 1983 the joint Beth Din was dissolved due to the unilateral American Reform Jewish decision to change the definition of Jewishness: The move was precipitated by the resolution on patrilineality adopted that year by the Central Conference of American Rabbis. This decision to redefine Jewish identity, as well as the designation of Denver as a pilot community for a new Reform outreach effort to recruit converts, convinced the Traditional and Conservative rabbis that they could no longer participate in the joint board...the national decision of the Reform rabbinate placed the Traditional and Conservative rabbis in an untenable position. They could not cooperate in a conversion program with rabbis who held so different a conception of Jewish identity. And furthermore, they could not supervise conversions that would occur with increasing frequency due to a Reform outreach effort that was inconsistent with their own understanding of how to relate to potential proselytes. — Wertheimer, A People Divided Specifically, in 1983, the Central Conference of American Rabbis passed a resolution waiving the need for formal conversion for anyone with at least one Jewish parent who has made affirmative acts of Jewish identity. This departed from the traditional position requiring formal conversion to Judaism for children without a Jewish mother. The 1983 resolution of the American Reform movement has had a mixed reception in Reform Jewish communities outside of the United States. Most notably, the Israel Movement for Progressive Judaism has rejected patrilineal descent and requires formal conversion for anyone without a Jewish mother. However, in 2015 the majority of Britain's Assembly of Reform Rabbis voted in favor of a position paper proposing "that individuals who live a Jewish life, and who are patrilineally Jewish, can be welcomed into the Jewish community and confirmed as Jewish through an individual process." Britain's Assembly of Reform Rabbis stated that rabbis "would be able to take local decisions – ratified by the Beit Din – confirming Jewish status." The end of the joint Beth Din program was welcomed by Haredi Orthodox groups, who saw the program as illegitimate. Further, Haredi groups attempted to prevent non-Orthodox rabbis from using mikvehs to fulfill the traditional requirements for converts. In the Haredi view, it is better to have no conversion at all than a non-Orthodox conversion, as they hold that non-Orthodox conversions are not true conversions at all. In the 1980s, Modern Orthodox Rabbi Norman Lamm, Rosh Yeshiva of Yeshiva University, along with other American and Israeli Orthodox rabbis, worked with Conservative and Reform rabbis to come up with a solution to the "Who is a Jew?" issue. In 1989 and 1990, Israeli Prime Minister Yitzhak Shamir spearheaded an effort to find a way to resolve the impasse. A plan was developed by Israeli Cabinet Secretary Elyakim Rubenstein, who negotiated secretly for many months with rabbis from Conservative, Reform and Orthodox Judaism, including faculty at Yeshiva University, with Lamm as Rosh Yeshiva. They planned to create a joint panel that would interview people who were converting to Judaism and considering making aliyah (moving to the State of Israel), and would refer them to a beth din that would convert the candidate following traditional halakha. 
All negotiating parties came to agreement. Many Reform rabbis took offense at the notion that the beth din must be strictly halakhic and Orthodox, but they acquiesced. However, when word about this project became public, a number of leading haredi rabbis issued a statement denouncing the project, condemning it as a "travesty of halakha". Rabbi Moshe Sherer, Chairman of Agudath Israel World Organization, stated that "Yes we played a role in putting an end to that farce, and I'm proud we did." Norman Lamm condemned this interference by Sherer, stating that it was "the most damaging thing that he [Sherer] ever did in his forty year career." Rabbi Lamm wanted this to be only the beginning of a solution to Jewish disunity. He stated that, had this unified conversion plan not been destroyed, he would have extended the program to the area of halakhic Jewish divorces, thus ending the problem of mamzerut. In 1987, the American-born British rabbi Sidney Brichto, of the country's Liberal Judaism movement, published widely discussed proposals for a historic compromise between progressive streams of Judaism and Orthodox Judaism. He advocated for the Orthodox Beit Din to oversee contentious areas. In return, progressive rabbis would earn respect from the Orthodox rabbinate, a degree of recognition and a role in Beit Din processes concerning progressive Jewry. Brichto's proposals encouraged Rabbi John Levi to support such an initiative in Melbourne. Among Brichto's proposals, progressive streams of Judaism would stop processing their own conversions to Judaism. Instead, their prospective converts would have their status conferred on them by an Orthodox Beit Din. The Beit Din would be expected to show more leniency than usual, expecting only that those before it demonstrate knowledge of Orthodox practice rather than observance. The proposal was rejected by Immanuel Jakobovits, Baron Jakobovits, then Chief Rabbi of the United Hebrew Congregations of the Commonwealth. Jakobovits reasoned: "How can an Orthodox Beth Din validate a conversion without kabbalat mitzvot [acceptance of the commandments]?" However, in 1990, the Chief Rabbi-elect, Jonathan Sacks, was more favourable to the proposal. In a letter to Brichto, he wrote: "As soon as I read your article... I called it publicly 'the most courageous statement by a non-Orthodox Jew this century'. I felt it was a genuine way forward. Others turned out not to share my view." He continued: "It will be a while - 18 months - before I take up office. But I believe we can still explore that way forward together. For if we do not move forward, I fear greatly for our community and for Am Yisrael." In 1997, the issue of "Who is a Jew?" again arose in the State of Israel, and Orthodox leaders such as Rabbi Norman Lamm publicly backed the Neeman commission, a group of Orthodox, Conservative and Reform rabbis working to develop joint programs for conversion to Judaism. That year, Lamm gave a speech at the World Council of Orthodox Leadership, in Glen Springs, New York, urging Orthodox Jews to support this effort: Lamm told his listeners that they should value and encourage the efforts of non-Orthodox leaders to more seriously integrate traditional Jewish practices into the lives of their followers. They should welcome the creation of Reform and Conservative day schools and not see them as a threat to their own, Lamm said. 
In many communities, Orthodox day schools, or Orthodox-oriented community day schools, have large numbers of students from non-Orthodox families. The liberal movements should be appreciated and encouraged because they are doing something Jewish, even if it is not the way that Orthodox Jews would like them to, he said. "What they are doing is something, and something is better than nothing," he said in his speech. "I'm very openly attacking the notion that we sometimes find in the Orthodox community that 'being a goy is better'" than being a non-Orthodox Jew, he said in an interview. The committee recommended the establishment of a joint institute for Jewish studies, which would be a joint effort by all three streams of Judaism. The committee also recommended that conversion proceedings themselves be held in special conversion courts, to be recognized by all denominations in Judaism. The purpose of the proposal was to prevent a rift in the Jewish people, while at the same time bringing about a state-sponsored arrangement for conversion. On 7 September 1998, the government adopted the Ne'eman Commission Report. A year later, the Joint Institute for Jewish Studies was established, and since then it has been the official state operator of conversion courses in Israel, including the military conversion courses. In 2015, the institute's name was changed to Nativ – The National Center for Jewish Studies, Identity and Conversion. A recent development has been the idea of annulling conversions to Judaism, sometimes many years after they have taken place, due to a reduction in religious observance or a change of community by the convert. Chuck Davidson, a Modern Orthodox expert on this conversion crisis, explains: "From the Middle Ages onwards, the greatest of the rabbis wrote explicitly that even if immediately after the conversion the convert goes off to worship idols, the person is still considered Jewish." The justification given for the change in approach is that the original conversion must never have been valid in the first place, as it is clear from the convert's subsequent actions that they were insincere at the time of conversion. The confusion over Jewish identity in Israel was made worse when Haredi Rabbi Avraham Sherman of Israel's supreme religious court (בית הדין הרבני הגדול) called into question the validity of over 40,000 Jewish conversions by upholding a ruling of the Ashdod Rabbinical Court to retroactively annul the conversion of a woman who came before it because, in the court's eyes, she had failed to observe Jewish law. This crisis deepened when Israel's Rabbinate called into question the validity of conversions performed in the army, meaning a soldier killed in action might not be buried according to Jewish law. In 2010, the rabbinate created further distrust in the conversion process when it began refusing to recognize Orthodox converts from the United States as Jewish. Indeed, the great-niece of the renowned Zionist Nahum Sokolow was recently deemed "not Jewish enough" to marry in Israel, after she failed to prove matrilineal Jewish descent going back four generations. Following a scandal in which U.S. 
Rabbi Barry Freundel was arrested on charges of installing hidden cameras in a mikveh to film women converts undressing, the Israeli Chief Rabbinate said it would review the validity of all past conversions performed by Freundel, then quickly reversed its decision, clarifying that it was joining the Orthodox Rabbinical Council of America in affirming the validity of the conversions. In December 2014, an Israeli court decided that a conversion could be annulled. In his decision, Justice Neal Hendel wrote: "Just as the civil court has the inalienable authority to reverse – in extremely rare cases – a final judgment, so too does the special religious conversion court. For otherwise, we would allow for judgments that are flawed from their inception to exist eternally." Consequences Once undergone, a valid religious conversion to Judaism cannot be overturned. However, a Beth Din may determine that the conversion is void because it was never undertaken correctly in the first place, for example, if the rite of mikveh was performed incorrectly. In recent years, many Orthodox conversions have been overturned. In 2008, Israel's highest religious court invalidated the conversions of 40,000 Jews, mostly from Russian immigrant families, even though they had been approved by an Orthodox rabbi. Debate on what constitutes a valid Beth Din for conversion and for annulling conversions has caused divisions in the Orthodox world, since such debate is an implicit judgment on the character and uprightness of the rabbis in the religious court in question. For example, when Rabbi Barry Freundel was arrested on charges of voyeurism for filming women converts at the mikveh he supervised, Israel's Chief Rabbinate initially threatened to review and possibly invalidate the conversions Freundel had been involved in approving. A crisis between American and Israeli rabbis was averted when the Chief Rabbinate agreed that all conversions completed by Freundel would be considered valid. Judaism is not an openly proselytizing religion. Judaism teaches that the righteous of all nations have a place in the afterlife. Much like in the other Abrahamic faiths, Jewish law requires the sincerity of a potential convert. In view of the foregoing considerations, most authorities are very careful about accepting converts. Essentially, they want to be sure that the convert knows what they are getting into, and that they are doing it for sincerely religious reasons. However, while conversion for the sake of love of Judaism is considered the best motivation, a conversion for the sake of avoiding intermarriage is also gaining acceptance. There is a tradition that a prospective convert should be turned away three times as a test of sincerity, though most rabbis no longer follow the tradition. Neither the Rabbinical Council of America nor the Rabbinical Assembly, the leading American Orthodox and Conservative organizations, suggests taking this action in its conversion policies, with the Central Conference of American Rabbis (CCAR) and Union for Reform Judaism (URJ) actively opposing the practice. Halakha forbids the mistreatment of a convert, including reminding a convert that they were once not a Jew. Hence, little to no distinction is made in Judaism between those who are born Jewish and those who are Jewish as a result of conversion. However, despite Halakha protecting the rights of converts, some Jewish communities have been accused of treating converts as second-class Jews. 
For example, many communities of Syrian Jews have banned conversion and refuse to recognise any Jewish conversion, including those done under Orthodox auspices (possibly influenced by sects in Syria, like the Druze, which do not accept converts). According to Orthodox interpretations of Halakha, converts face a limited number of restrictions. A marriage between a female convert and a kohen (a member of the priestly class) is prohibited, and any children of the union do not inherit their father's kohen status. While a Jew by birth may not marry a mamzer, a convert can. Descendants of converts can become rabbis. For instance, Rabbi Meir Baal Ha Nes is thought to be a descendant of a proselyte. Rabbi Akiva was also a well-known son of converts. The Talmud lists many of the Jewish nation's greatest individuals who either descended from or were themselves converts. Asenath, the wife of Joseph (son of Jacob), is mentioned as a possible convert. There are Midrashim attesting to her conversion, along with that of other women, including Hagar, Zipporah, Shiphrah, Puah, the daughter of Pharaoh, Rahab, Ruth, and Jael. In fact, King David is descended from Ruth, a convert to Judaism. (Ruth 4:13–22) In Orthodox and Conservative communities that maintain tribal distinctions, converts become Yisraelim (Israelites), ordinary Jews with no tribal or inter-Jewish distinctions. Converts typically follow the customs of their congregations; so, a convert who prays at a Sephardi synagogue would follow Sephardi customs and learn Sephardi Hebrew. A convert chooses his or her own Hebrew first name upon conversion but is traditionally known as the son or daughter of Abraham and Sarah, the first patriarch and matriarch in the Torah, often with the additional qualifier of "Avinu" (our father) and "Imenu" (our mother). Hence, a convert named Akiva would be known, for ritual purposes in a synagogue, as "Akiva ben Avraham Avinu"; in cases where the mother's name is used, such as for the prayer for recovery from an illness, he would be known as "Akiva ben Sarah Imenu". Talmudic opinions on converts are numerous; some positive, some negative. A quote from the Talmud labels the convert "hard on Israel as a scab". Many interpretations explain this quote as meaning that converts can be unobservant and lead Jews to be unobservant, or that converts can be so observant that born Jews feel ashamed. Jews by choice The term "Jews by choice" is often used to describe people who chose to convert to Judaism, many of whom have no ancestral connection to the Jewish people. It is often contrasted with such terms as "Jew by birth" (or "Jew by chance"). The practice of conversion to Judaism is sometimes understood within Orthodox Judaism in terms of reincarnation. According to this school of thought in Judaism, when non-Jews are drawn to Judaism, it is because they had been Jews in a former life. Such souls may "wander among nations" through multiple lives, until they find their way back to Judaism, including through finding themselves born in a gentile family with a "lost" Jewish ancestor. Bnei Anusim In recent decades, there has been renewed interest in conversion to Judaism among some Bnei Anusim, that is, the descendants of Jews who were forced to convert to other faiths. The Hebrew term for forced converts is "Anusim" (lit. "forced [converts]"), while the descendants of said converts are called "Bnei Anusim" (lit. "children of forced [converts]"). 
In the modern era, the single most notable and numerous group of Bnei Anusim converts are the Sephardic Bnei Anusim, descendants of those Sephardic Jews who were forced to convert to Christianity during the Spanish and Portuguese Inquisitions. They are found throughout Iberia (Spain and Portugal) and Iberoamerica (the Hispanic countries of the Americas plus Brazil). A steadily growing number of them are now prospective converts, actively seeking conversion back to Judaism.[citation needed] Since many Bnei Anusim lack an unbroken matrilineal Jewish line of descent, or lack satisfactory documentary evidence to that effect (even if they can prove Jewish ancestry along one or all of their other lineages besides the direct matrilineal line), conversion has become a growing option for them to return to Judaism.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-384] | [TOKENS: 12858]
Contents Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. The game was originally created by Markus "Notch" Persson using the Java programming language; Jens "Jeb" Bergensten was handed control over its development following the full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios holds the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history, with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game with no required goals to accomplish, giving players a large amount of freedom in choosing how to play. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of a third-person perspective. The game world is composed of rough 3D objects, mainly cubes referred to as blocks, representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. Blocks are arranged in a fixed voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Only a few block types are affected by gravity; all others maintain their position in the grid even when unsupported in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. Players may also craft helpful blocks, such as furnaces, which can cook food and smelt ores, and torches, which produce light, or trade with villagers (NPCs), exchanging emeralds for goods and vice versa.
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin from nine possibilities, including Steve and Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities), including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs, including large spiders, witches, skeletons, and zombies, spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it, using a map seed that is randomly chosen at the time of world creation or manually specified by the player; because terrain is derived deterministically from that seed, the same seed always reproduces the same world (a minimal sketch of this seed-driven approach follows this passage). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on how far players can travel have existed throughout development, some intentional and some not. The implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks from the world center, where terrain generated in wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved; the current horizontal limit is instead a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky.
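The essential property of seed-based generation is that terrain is a pure function of the seed and the coordinates, so any chunk can be produced on demand without storing the whole map. The Python sketch below illustrates only that idea; all function names are hypothetical, and Minecraft's actual generator, which layers smooth noise, biomes, and structures, is far more elaborate.

```python
import hashlib

def column_height(seed: int, x: int, z: int) -> int:
    """Derive a terrain height for the column at (x, z) purely from the seed.

    Hashing (seed, x, z) makes generation deterministic: the same seed
    always yields the same world, and any column can be computed on
    demand, which is what makes an "effectively infinite" world cheap
    to represent. (Real generators use smooth noise, e.g. Perlin noise,
    rather than a raw hash, so neighboring columns blend together.)
    """
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return 60 + int.from_bytes(digest[:4], "big") % 16  # base height plus 0-15 variation

def chunk_heights(seed: int, cx: int, cz: int, size: int = 16) -> list[list[int]]:
    """Lazily generate one size-by-size chunk of column heights."""
    return [[column_height(seed, cx * size + dx, cz * size + dz)
             for dx in range(size)]
            for dz in range(size)]

if __name__ == "__main__":
    a = chunk_heights(seed=42, cx=0, cz=0)
    b = chunk_heights(seed=42, cx=0, cz=0)
    assert a == b  # the same seed reproduces the same terrain every time
    print(a[0][:8])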
Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, whereby players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough that takes about nine minutes to scroll past; it is the game's only narrative text, and the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar, which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty; if the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar, or continuously on peaceful difficulty. Upon losing all health, players die, and the items in their inventories are dropped unless the game is configured otherwise. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn five minutes later. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, breeding animals, and cooking food. Experience can then be spent on enchanting tools, armor, and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update and prevents the player from directly modifying the game's world; it was designed primarily for use in custom maps, allowing map designers to ensure players experience their creations as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other in a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by creating a Realm, using a hosting provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network, or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server, typically through the dedicated server's plain-text configuration files (an illustrative excerpt appears at the end of this passage). Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Bedrock Edition Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps, while Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, Mojang announced that support for cross-platform play between Windows 10, iOS, and Android platforms would be added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018.
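As an illustration of the kind of configuration involved, the excerpt below uses real keys from the Java Edition dedicated server's server.properties file with made-up values; the allow list itself is kept in a separate whitelist.json file, and operator privileges in ops.json.

```
# server.properties (Java Edition dedicated server) - illustrative values
server-port=25565        # the default Minecraft server port
gamemode=survival        # default game mode for new players
difficulty=normal
pvp=true                 # enable player-versus-player combat
max-players=10           # cap on concurrent players
white-list=true          # admit only players listed in whitelist.json
level-seed=42            # fix the map seed used for world generation
motd=A small private survival server
```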
The modding community consists of fans, users, and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs, and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add elements from other video games and media. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements, including textures and sounds. Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles, and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation (a minimal pack layout is sketched at the end of this passage). The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update, while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music, and user interface. The first mash-up pack (and, by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013 and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and was later bundled free with the Nintendo Switch Edition at launch. Another, based on Fallout, was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected; when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue and released a statement explaining that "the code would not be run or read by the game itself" and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs, and add-ons from different creators can be bought with "Minecoins", a digital currency purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue.
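To give a concrete sense of data packs: a pack is simply a folder dropped into a world's datapacks directory, containing a pack.mcmeta descriptor and JSON files under namespaced subfolders. The layout below is a hypothetical minimal example; the pack name and namespace are invented, and the required pack_format number depends on the game version (the value 4 corresponds to the 1.13-era format).

```
world/datapacks/example_pack/
    pack.mcmeta
    data/
        example/
            recipes/
                ruby_block.json      (a custom crafting recipe)

# pack.mcmeta -- the only mandatory file:
{
  "pack": {
    "pack_format": 4,
    "description": "Example data pack"
  }
}
```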
Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. Among the features he explored in RubyDung was a first-person view similar to Dungeon Keeper, though he ultimately discarded the idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction, heavily influencing the game, including bringing back the first-person mode, the "blocky" visual style, and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson released the full 1.0 version, the second part of the "Adventure Update", on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA) that had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies, including Activision Blizzard and Electronic Arts. The deal with Microsoft was completed on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions received major updates roughly annually, free to players who had purchased the game, each primarily centered around a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as the various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs; it cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features, such as dynamic shadows, screen space reflections, volumetric fog, and bloom, without the need for RTX-capable hardware. Vibrant Visuals was released as part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to release on Java Edition at a later date. Development of the original edition of Minecraft, then known as Cave Game and now known as the Java Edition, began in May 2009;[k] on 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player: "Order of the Stone" came from the webcomic The Order of the Stick, while "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases, dubbed Survival Test, Indev, and Infdev, were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum, but he later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December, and he assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. The move included Mojang taking apparent ownership of the CraftBukkit server mod, though this acquisition later became controversial and its legitimacy was questioned, due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements and lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay with other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One and renamed the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 Edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions, but it received updates bringing it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions, released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, with a physical copy available at a later date. The game is compatible only with the New Nintendo 3DS and New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. A native Bedrock Edition version for PlayStation 5 was released on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as the "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, macOS, and Windows.
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in autumn 2018; it was released on the App Store on 6 September 2018. On 27 March 2019, it was announced that Education Edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store-compatible Chromebooks, and the full game was released on the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. On Windows, Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added new features to this version of Minecraft, such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version automatically gained access to the other, though the two games otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for the larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after the character of the same name from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers to it for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for the HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints; Vivecraft was endorsed by the Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the process, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced about creating the in-game sound for grass blocks, stating, "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hissing of spiders. He elaborated, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of Rosenfeld's sound design decisions were made accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled, "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld found the sound engine "terrible" to work with, remembering, "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose it, Rosenfeld used Ableton Live along with several additional plug-ins. Speaking of them, Rosenfeld said, "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included music added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by the indie electronic label Ghostly International on 21 August 2015.
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate-color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", which introduced pieces by Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine serving as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's mini games in the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and he confirmed its existence in a 2017 tweet, stating that his work on the record had by then grown longer than the previous two albums combined, which together clock in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether there was still a third volume of his music intended for release. Rosenfeld responded, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment of Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed by the troublesome steps needed to set up multiplayer servers, calling the process a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version by Scott Munro of the Daily Record called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, the gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version.
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they praised the port's addition of a tutorial, in-game tips, and crafting recipes, saying that these made the game more user-friendly. The Xbox One Edition was one of the best-received ports, praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 Edition was the best-received port to date, praised for having worlds 36 times larger than the PlayStation 3 Edition's and described as nearly identical to the Xbox One Edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics but still noted a lack of content. Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and had never been commercially advertised except through word of mouth and various unpaid references in popular media, such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014, the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies, and as of April 2025, over 350 million. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when it broke the Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day.
As of 4 April 2014, the Xbox 360 version had sold 12 million copies. In addition, Minecraft: Pocket Edition had reached 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both the PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft had been sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version had sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million in the second quarter of 2015. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms, with over 126 million monthly active players. By April 2021, the number of monthly active users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth-best game of the year as well as the eighth-best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards, for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At the Game Developers Choice Awards 2011, Minecraft won awards in the categories of Best Debut Game, Best Downloadable Game, and the Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit, which opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category, and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game of the Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list.
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game but lost to Just Dance 2014. The game later won the award for Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award – PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Persson's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that it enabled improved security, including two-factor authentication, the blocking of cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature to Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts. Initially, the winning mob was to be implemented in a future update while the losing mobs were scrapped, though after the first vote this was changed so that losing mobs would retain a chance of coming to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs, the crab, the penguin, and the armadillo, with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning. In September 2024, as part of a blog post detailing future plans for Minecraft's development, Mojang announced that the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release version to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos, often made by commentators, began to gain influence on YouTube. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game went on to be a prominent fixture of YouTube's gaming scene throughout the 2010s; in 2014, it was the second-most searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the website had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character whose moveset includes references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by the electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering in Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, Cody Sumter, a member of the Human Dynamics group at the MIT Media Lab, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN-Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding, "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments into Minecraft. The first pilot project, in Kibera, one of Nairobi's informal settlements, was in the planning phase at the time. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture; the ideas presented by the citizens served as a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on its own geodata. This was feasible because Denmark is one of the flattest countries, with its highest point at 171 meters (it ranks as the country with the 30th-smallest elevation span), while the height limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility in places where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors who have been censored or arrested, such as Jamal Khashoggi, from countries including Egypt, Mexico, Russia, Saudi Arabia, and Vietnam. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, in which players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 it reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition, with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone enabled the construction of functional virtual machines, such as a hard drive and an 8-bit computer, and mods have been created to use these mechanics for teaching programming; a brief sketch of how such logic composes appears at the end of this passage. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources.
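The reason redstone can implement full computers is functional completeness: a redstone torch behaves as a NOT gate, merged dust lines behave as OR, and every other gate can be composed from those two. The Python sketch below illustrates that composition; the function names are ours, not anything from the game or its mods.

```python
def NOT(a: bool) -> bool:
    """A redstone torch: outputs the inverse of its input signal."""
    return not a

def OR(a: bool, b: bool) -> bool:
    """Two redstone dust lines merged into one: on if either input is on."""
    return a or b

def AND(a: bool, b: bool) -> bool:
    """Built from NOT and OR via De Morgan's law: a AND b = NOT(NOT a OR NOT b)."""
    return NOT(OR(NOT(a), NOT(b)))

def XOR(a: bool, b: bool) -> bool:
    """Exclusive or, composed from the gates above."""
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """One bit of binary addition, the building block of an in-game
    'computer': returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} -> sum={int(s)}, carry={int(c)}")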
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft."

Following the initial surge in popularity of Minecraft in 2010, other video games were criticised for their similarities to Minecraft, and some were described as "clones", whether because of direct inspiration from Minecraft or merely superficial resemblance. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and the Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Fans' fears ultimately proved unfounded, as official Minecraft releases on Nintendo consoles eventually resumed. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team.

In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is AI-generated in real time, and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before.

In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging that the game infringed on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright-claiming service. The DMCA notice was later withdrawn.

Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded as "Minecraft Live", included the mob/biome votes and announcements of new game updates.
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Integer_(computer_science)] | [TOKENS: 2272]
Contents Integer (computer science) In computer science, an integer is a datum of integral data type, a data type that represents some range of mathematical integers. Integral data types may be of different sizes and may or may not be allowed to contain negative values. Integers are commonly represented in a computer as a group of binary digits (bits). The size of the grouping varies, so the set of integer sizes available varies between different types of computers. Computer hardware nearly always provides a way to represent a processor register or memory address as an integer.

Value and representation The value of an item with an integral type is the mathematical integer that it corresponds to. Integral types may be unsigned (capable of representing only non-negative integers) or signed (capable of representing negative integers as well). An integer value is typically specified in the source code of a program as a sequence of digits optionally prefixed with + or −. Some programming languages allow other notations, such as hexadecimal (base 16) or octal (base 8). Some programming languages also permit digit group separators. The internal representation of this datum is the way the value is stored in the computer's memory. Unlike mathematical integers, a typical datum in a computer has some minimum and maximum possible value. The most common representation of a positive integer is a string of bits, using the binary numeral system. The order of the memory bytes storing the bits varies; see endianness. The width, precision, or bitness of an integral type is the number of bits in its representation. An integral type with n bits can encode 2^n numbers; for example, an unsigned type typically represents the non-negative values 0 through 2^n − 1. Other encodings of integer values to bit patterns are sometimes used, for example binary-coded decimal or Gray code, or as printed character codes such as ASCII.

There are four well-known ways to represent signed numbers in a binary computing system. The most common is two's complement, which allows a signed integral type with n bits to represent numbers from −2^(n−1) through 2^(n−1) − 1. Two's complement arithmetic is convenient because there is a perfect one-to-one correspondence between representations and values (in particular, no separate +0 and −0), and because addition, subtraction and multiplication do not need to distinguish between signed and unsigned types. Other possibilities include offset binary, sign-magnitude, and ones' complement.

Some computer languages define integer sizes in a machine-independent way; others have varying definitions depending on the underlying processor word size. Not all language implementations define variables of all integer sizes, and defined sizes may not even be distinct in a particular implementation. An integer in one programming language may be a different size in a different language, on a different processor, or in an execution context of different bitness; see § Words.
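The signed and unsigned behavior described above can be made concrete with a short C program. This is a minimal sketch using only standard C99 headers; note that converting 0xFF to int8_t is formally implementation-defined (it yields -1 on the common two's-complement machines):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* An n-bit two's-complement type spans -2^(n-1) .. 2^(n-1) - 1. */
    printf("int8_t:  %d .. %d\n", INT8_MIN, INT8_MAX);   /* -128 .. 127 */
    printf("int16_t: %d .. %d\n", INT16_MIN, INT16_MAX); /* -32768 .. 32767 */

    /* The same bit pattern reads differently as unsigned and signed:
       0xFF is 255 as a uint8_t but -1 when reinterpreted as int8_t. */
    uint8_t u = 0xFF;
    int8_t s = (int8_t)u;
    printf("0xFF unsigned: %u, signed: %d\n", (unsigned)u, s);

    /* Unsigned arithmetic wraps modulo 2^n: 255 + 1 == 0 for uint8_t. */
    uint8_t w = (uint8_t)(u + 1u);
    printf("255 + 1 as uint8_t: %u\n", (unsigned)w);
    return 0;
}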
Some older computer architectures used decimal representations of integers, stored in binary-coded decimal (BCD) or other format. These values generally require data sizes of 4 bits per decimal digit (sometimes called a nibble), usually with additional bits for a sign. Many modern CPUs provide limited support for decimal integers as an extended datatype, providing instructions for converting such values to and from binary values. Depending on the architecture, decimal integers may have fixed sizes (e.g., 7 decimal digits plus a sign fit into a 32-bit word), or may be variable-length (up to some maximum digit size), typically occupying two digits per byte (octet).

Common integral data types [Table of integral type widths supported in hardware by common processors omitted; its widest, 128-bit, entries are used for values such as IPv6 addresses and GUIDs.] Different CPUs support different integral data types. Typically, hardware will support both signed and unsigned types, but only a small, fixed set of widths. The table lists integral type widths that are supported in hardware by common processors. High-level programming languages provide more possibilities. It is common to have a 'double width' integral type that has twice as many bits as the biggest hardware-supported type. Many languages also have bit-field types (a specified number of bits, usually constrained to be less than the maximum hardware-supported width) and range types (that can represent only the integers in a specified range). Some languages, such as Lisp, Smalltalk, REXX, Haskell, Python, and Raku, support arbitrary precision integers (also known as infinite precision integers or bignums). Other languages that do not support this concept as a top-level construct may have libraries available to represent very large numbers using arrays of smaller variables, such as Java's java.math.BigInteger class or Perl's "bigint" package. These use as much of the computer's memory as is necessary to store the numbers; however, a computer has only a finite amount of storage, so they, too, can only represent a finite subset of the mathematical integers. These schemes support very large numbers; for example, one kilobyte of memory could be used to store numbers up to 2466 decimal digits long.
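The kilobyte figure quoted above is simple arithmetic rather than a library fact: an n-bit magnitude has about n × log10(2) ≈ 0.301 × n decimal digits, and one kilobyte holds 8192 bits. A quick C check of that estimate (compile with -lm):

#include <stdio.h>
#include <math.h>

int main(void) {
    int bits = 1024 * 8;                /* one kilobyte of storage */
    double digits = bits * log10(2.0);  /* number of decimal digits in 2^8192 */
    printf("approx. decimal digits: %.1f\n", digits); /* about 2466.0 */
    return 0;
}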
A Boolean type is a type that can represent only two values: 0 and 1, usually identified with false and true respectively. This type can be stored in memory using a single bit, but is often given a full byte for convenience of addressing and speed of access. A four-bit quantity is known as a nibble (when eating, being smaller than a bite) or nybble (being a pun on the form of the word byte). One nibble corresponds to one digit in hexadecimal and holds one digit or a sign code in binary-coded decimal.

The term byte initially meant 'the smallest addressable unit of memory'. In the past, 5-, 6-, 7-, 8-, and 9-bit bytes have all been used. There have also been computers that could address individual bits ('bit-addressed machine'), or that could only address 16- or 32-bit quantities ('word-addressed machine'). The term byte was usually not used at all in connection with bit- and word-addressed machines. The term octet always refers to an 8-bit quantity. It is mostly used in the field of computer networking, where computers with different byte widths might have to communicate. In modern usage byte almost invariably means eight bits, since all other sizes have fallen into disuse; thus byte has come to be synonymous with octet.

Words The term 'word' is used for a small group of bits that are handled simultaneously by processors of a particular architecture. The size of a word is thus CPU-specific. Many different word sizes have been used, including 6, 8, 12, 16, 18, 24, 32, 36, 39, 40, 48, 60, and 64 bits. Since it is architectural, the size of a word is usually set by the first CPU in a family, rather than the characteristics of a later compatible CPU. The meanings of terms derived from word, such as longword, doubleword, quadword, and halfword, also vary with the CPU and OS. Practically all new desktop processors are capable of using 64-bit words, though embedded processors with 8- and 16-bit word size are still common. The 36-bit word length was common in the early days of computers.

One important cause of non-portability of software is the incorrect assumption that all computers have the same word size as the computer used by the programmer. For example, if a programmer using the C language incorrectly declares as int a variable that will be used to store values greater than 2^15 − 1, the program will fail on computers with 16-bit integers. That variable should have been declared as long, which has at least 32 bits on any computer. Programmers may also incorrectly assume that a pointer can be converted to an integer without loss of information, which may work on (some) 32-bit computers, but fail on 64-bit computers with 64-bit pointers and 32-bit integers. This issue is resolved by C99 in stdint.h in the form of intptr_t. The bitness of a program may refer to the word size (or bitness) of the processor on which it runs, or it may refer to the width of a memory address or pointer, which can differ between execution modes or contexts. For example, 64-bit versions of Microsoft Windows support existing 32-bit binaries, and programs compiled for Linux's x32 ABI run in 64-bit mode yet use 32-bit memory addresses.

The standard integer size is platform-dependent. In C, it is denoted by int and required to be at least 16 bits. Windows and Unix systems have 32-bit ints on both 32-bit and 64-bit architectures. A short integer can represent a whole number that may take less storage, while having a smaller range, compared with a standard integer on the same machine. In C, it is denoted by short. It is required to be at least 16 bits, and is often smaller than a standard integer, but this is not required. A conforming program can assume that it can safely store values between −(2^15 − 1) and 2^15 − 1, but it may not assume that the range is not larger. In Java, a short is always a 16-bit integer. In the Windows API, the datatype SHORT is defined as a 16-bit signed integer on all machines. A long integer can represent an integer whose range is greater than or equal to that of a standard integer on the same machine. In C, it is denoted by long. It is required to be at least 32 bits, and may or may not be larger than a standard integer. A conforming program can assume that it can safely store values between −(2^31 − 1) and 2^31 − 1, but it may not assume that the range is not larger.

In the C99 version of the C programming language and the C++11 version of C++, a long long type is supported that has double the minimum capacity of the standard long. This type is not supported by compilers that require C code to be compliant with the previous C++ standard, C++03, because the long long type did not exist in C++03. For an ANSI/ISO compliant compiler, the minimum requirements for the specified ranges, that is, −(2^63 − 1) to 2^63 − 1 for signed and 0 to 2^64 − 1 for unsigned, must be fulfilled; however, extending this range is permitted. This can be an issue when exchanging code and data between platforms, or doing direct hardware access. Thus, there are several sets of headers providing platform-independent exact-width types. The C standard library provides <stdint.h>; this was introduced in C99 and C++11.
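A minimal sketch of the portability fixes mentioned above, using the exact-width and pointer-sized types that C99's <stdint.h> and <inttypes.h> provide (the sizes printed at the end are platform-dependent by design):

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h> /* PRId32: portable printf format for int32_t */

int main(void) {
    /* 100000 does not fit in 16 bits, so a plain 'int' (minimum 16 bits)
       is not portable here; int32_t has the same width everywhere. */
    int32_t counter = 100000;
    printf("counter = %" PRId32 "\n", counter);

    /* Round-tripping a pointer through an integer requires intptr_t;
       a plain int or long may be narrower than a pointer. */
    int x = 42;
    intptr_t p = (intptr_t)&x;
    int *back = (int *)p;
    printf("*back = %d\n", *back); /* prints 42 */

    printf("sizeof(int) = %zu, sizeof(void *) = %zu\n",
           sizeof(int), sizeof(void *));
    return 0;
}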
Syntax Integer literals can be written as regular Arabic numerals, consisting of a sequence of digits and with negation indicated by a minus sign before the value. However, most programming languages disallow use of commas or spaces for digit grouping. Examples of integer literals are 42, 10000, and −233. There are several alternate methods for writing integer literals in many programming languages, such as hexadecimal with a 0x prefix (0x2A), octal with a leading zero (052), binary with a 0b prefix in languages that support it (0b101010), and underscores as digit group separators in languages such as Java and Python (1_000_000).

Extreme values In many programming languages, there exist predefined constants representing the least and the greatest values representable with a given integer type. Names for these include INT_MIN and INT_MAX from C's <limits.h>, Integer.MIN_VALUE and Integer.MAX_VALUE in Java, and int.MinValue and int.MaxValue in C#.
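As a brief illustration of such constants, the following C program prints the extreme values of a few standard types from <limits.h>; the long results differ between platforms (for example, 64-bit Linux versus 64-bit Windows), which is exactly the variability discussed earlier:

#include <stdio.h>
#include <limits.h>

int main(void) {
    printf("short: %d .. %d\n", SHRT_MIN, SHRT_MAX);
    printf("int:   %d .. %d\n", INT_MIN, INT_MAX);
    printf("long:  %ld .. %ld\n", LONG_MIN, LONG_MAX);
    printf("unsigned int max: %u\n", UINT_MAX);

    /* The same value, 42, as decimal, hexadecimal, and octal literals. */
    printf("%d %d %d\n", 42, 0x2A, 052);
    return 0;
}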
========================================
[SOURCE: https://en.wikipedia.org/wiki/Orders_of_magnitude_(temperature)] | [TOKENS: 64]
Contents Orders of magnitude (temperature) List of orders of magnitude for temperature. [Summary table omitted; see the detailed list below.] Detailed list for 100 K to 1000 K Most ordinary human activity takes place at temperatures of this order of magnitude. [Detailed table omitted; in the original, circumstances where water naturally occurs in liquid form are shown in light grey.]
========================================
[SOURCE: https://en.wikipedia.org/wiki/Animal_tale] | [TOKENS: 650]
Contents Animal tale An animal tale or beast fable generally consists of a short story or poem in which animals talk. They may exhibit other anthropomorphic qualities as well, such as living in a human-like society. It is a traditional form of allegorical writing. Animal tales can be understood in universal terms of how animal species relate to each other (for example, predators wishing to eat prey), rather than in terms of human groups in a specific society. Thus, readers are able to understand characters' motives even if they do not come from the same cultural background as the author, and animal tales can be appreciated in times and locations far removed from their origins.

History Important traditions in beast fables are represented by the Panchatantra and Kalila and Dimna (Sanskrit and Arabic originals), Aesop (Greek original), One Thousand and One Nights (Arabian Nights) and separate trickster traditions (West African and Native American). The medieval French cycle of allegories, the Roman de Reynart, is called a beast-epic, with the recurring figure Reynard the Fox. Beast fables are commonly translated between languages and often used for educational purposes. For example, Latin versions of Aesop's Fables were standard educational material in the European Middle Ages, over a millennium after they were written. Because of their lack of human social context, animal tales can readily spread from culture to culture. The Uncle Remus stories introduced the African-style trickster character Br'er Rabbit to American culture. Br'er Rabbit is smaller and weaker than most characters he encounters, but defeats them with cleverness, similar to tricksters of African folklore such as Anansi.

20th century First published in 1902, the Peter Rabbit books follow various animal characters and are each intended to teach a particular moral to children. The Wind in the Willows (1908) is another British children's novel of the era. In the 1945 English novel Animal Farm, various political ideologies are personified as animals, such as the Stalinist pig Napoleon and the numerous "sheep" that followed his directions without question. Rather than being a story for children, this book was intended for adults attempting to understand the new political landscape during the post-World War II Red Scare. Post-war English examples of the genre include the "Uncle" series (1964–1973) by J. P. Martin, and the novels of Richard Adams, most notably Watership Down (1972).

21st century Many modern books, films, and video games can be considered animal tales. In American cinema, the Academy Award-winning film Zootopia serves as a fable about prejudice and stereotypes, in which the talking animal characters experience social problems and their species serve as analogies for racial groups. The 2017 video game Night in the Woods has been cited as an allegory for becoming an adult,[unreliable source?] as well as for late-stage capitalism. Aggretsuko, a 2016 anime, features talking animal characters and examines themes such as misogyny and workplace anxiety.[unreliable source?] Cartoons and other media featuring talking animals are central to the furry fandom subculture.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Formal_organization] | [TOKENS: 739]
Contents Formal organization A formal organization is an organization with a fixed set of rules of intra-organization procedures and structures. As such, it is usually set out in writing, with a language of rules that ostensibly leave little discretion for interpretation. Sociologist Max Weber devised a model of formal organization known as the bureaucratic model, which is based on the rationalization of activities through standards and procedures. It is one of the most applied formal organization models. In some societies and in some organizations, such rules may be strictly followed; in others, they may be little more than an empty formalism.

Distinction from informal organization Formal rules are often adapted to subjective interests (social structures within an enterprise and the personal goals, desires, sympathies and behaviors of the individual workers), so that the practical everyday life of an organization becomes informal. Practical experience shows no organization is ever completely rule-bound: instead, all real organizations represent some mix of formal and informal. Consequently, when attempting to legislate for an organization and to create a formal structure, it is necessary to recognize informal organization in order to create workable structures. However, informal organization can fail, or, if already set in order, can work against mismanagement. Formal organizations are typically understood to be systems of coordinated and controlled activities that arise when work is embedded in complex networks of technical relations and boundary-spanning exchanges. But in modern societies, formal organizational structures arise in highly institutional contexts. Organizations are driven to incorporate the practices and procedures defined by prevailing rationalized concepts of organizational work and institutionalized in society. Organizations that do so increase their legitimacy and their survival prospects, independent of the immediate efficacy of the acquired practices and procedures. There can develop a tension between, on the one hand, the institutionalized products, services, techniques, policies, and programs that function as myths (and may be ceremonially adopted), and efficiency criteria on the other hand. To maintain ceremonial conformity, organizations that reflect institutional rules tend to buffer their formal structures from the uncertainties of the technical activities by developing a loose coupling between their formal structures and actual work activities. (John Meyer and Brian Rowan, 1976)

Identification numbers and public registers In some countries, formal organizations are registered in public registers to make their identification easier even if an organization renames. [Original list of example organization identifiers omitted.]

The Hawthorne experiments The deviation from rule-making on a higher level was documented for the first time in the Hawthorne studies (1924–1932) and called informal organization. At first this discovery was dismissed as the product of avoidable errors, until it finally had to be recognized that these unwritten laws of work of everyday life often had more influence on the fate of the enterprise than those conceived on organizational charts of the executive level. Numerous empirical studies in sociological organization research followed, ever more clearly providing evidence for this, particularly during the human relations movement.
It is important to analyze informal structures within an enterprise to make use of positive innovations, but also to be able to do away with bad habits that have developed over time.

Reasons for informal organization There are many different reasons for informal organization. Managerial organization theory often still regards informal organization as rather disturbing, but sometimes helpful. In the opinion of systems theory and cybernetics, however, formal organization fades into the background and only serves, if necessary, to supplement or to correct. Changes in structure always redevelop because of the conduct and differences among coworkers, and the ability of self-organization is recognized as a natural characteristic of a social system.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Anjana_(Cantabrian_mythology)] | [TOKENS: 752]
Contents Anjana (Cantabrian mythology) The Anjana (Cantabrian: (Western) [anˈhana], (Eastern) [an.xa.nɜ]; Spanish: [anˈxana]) (from jana, a former word for witches during the Middle Ages) are one of the best-known fairies of Cantabrian mythology. These female fairy creatures foil the cruel and ruthless Ojáncanu. In most stories, they are the good fairies of Cantabria, generous and protective of all people. Their depiction in Cantabrian mythology is reminiscent of the lamias in ancient Greek mythology, as well as the xanas in Asturias, the janas in León, and the lamias in the Basque Country, the latter without the zoomorphic appearance.

Representatives of God and tree spirits Oral tradition provides different explanations for the nature of the Anjana. Some say they are heavenly beings sent by God to do good deeds, and that they go back to heaven after 400 years, never to return. Others, however, indicate that they are spirits of trees who take care of the forests.

Description Anjana are described as beautiful and delicate, 0.5 ft (0.15 m) tall, with white skin and a sweet voice; some sound like a nightingale when they are happy, others like a beetle stepping on leaves in autumn. Their eyes are slanted, serene and loving, with black or blue pupils as bright as the stars, and they feature nearly transparent wings. They wear long, jet black or golden braids, adorned with multicolored silk bows and ribbons; a beautiful crown of wild flowers on their head; and a blue cape over a long thin white tunic, and they carry in their hands a stick of wicker or hawthorn which shines in a different color every day of the week.

Association with forest trails, water, and gift-giving They are seen walking through the forest trails, resting on the banks of springs and on the margins of streams, which then seem to come alive. They are able to talk with the water that flows from the sources and springs. They help injured animals and trees damaged by storms or by Ojáncanu, lovers, people who lose their way in the forest, and the poor and suffering. Whenever they wander in villages, they leave gifts at the doors of helpful and kind people. When summoned for help they accept if the summoner is good of heart, but they also punish the wicked.

Spring equinox Traditions state that at night during the spring equinox, they gather in the fells and dance until dawn, holding hands and scattering roses. Anyone who manages to find a rose with purple, green, blue, or golden petals will be happy until the time of their death.

Similar fairies in Cantabrian tradition Other Cantabrian-related fairies are the Hechiceras del Ebro (Enchantresses of the Ebro River), the Mozas del Agua (Water Lasses), the Viejuca de Vispieres (the Vispieres Little Old Woman), the Anjanas of Treceño, the Moras de Carmona (Moorish Maidens of Carmona) and the Ijanas del Valle de Aras (Ijanas of the Aras Valley).

The Anjanas and Christmas Anjanas come to villages of the region during the night of January 5, with the intention of bringing children a variety of toys and gifts. This occurs every four years, generally to poor families, and still occurs annually in some areas of Cantabria.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Lod#cite_ref-18] | [TOKENS: 4733]
Contents Lod Lod (Hebrew: לוד, fully vocalized: לֹד), also known as Lydda (Ancient Greek: Λύδδα) and Lidd (Arabic: اللِّدّ, romanized: al-Lidd, or اللُّدّ, al-Ludd), is a city 15 km (9½ mi) southeast of Tel Aviv and 40 km (25 mi) northwest of Jerusalem in the Central District of Israel. It is situated between the lower Shephelah on the east and the coastal plain on the west. The city had a population of 90,814 in 2023. Lod has been inhabited since at least the Neolithic period. It is mentioned a few times in the Hebrew Bible and in the New Testament. From the 5th century BCE until the late Roman period, it was a prominent center for Jewish scholarship and trade. Around 200 CE, the city became a Roman colony and was renamed Diospolis (Ancient Greek: Διόσπολις, lit. 'city of Zeus'). Tradition identifies Lod as the 4th century martyrdom site of Saint George; the Church of Saint George and Mosque of Al-Khadr located in the city is believed to have housed his remains. Following the Arab conquest of the Levant, Lod served as the capital of Jund Filastin; however, a few decades later, the seat of power was transferred to Ramla, and Lod slipped in importance. Under Crusader rule, the city was a Catholic diocese of the Latin Church and it remains a titular see to this day.[citation needed] Lod underwent a major change in its population in the mid-20th century. Exclusively Palestinian Arab in 1947, Lod was part of the area designated for an Arab state in the United Nations Partition Plan for Palestine; however, in July 1948, the city was occupied by the Israel Defense Forces, and most of its Arab inhabitants were expelled in the Palestinian expulsion from Lydda and Ramle. The city was largely resettled by Jewish immigrants, most of them expelled from Arab countries. Today, Lod is one of Israel's mixed cities, with an Arab population of 30%. Lod is one of Israel's major transportation hubs. The main international airport, Ben Gurion Airport, is located 8 km (5 miles) north of the city. The city is also a major railway and road junction. Religious references The Hebrew name Lod appears in the Hebrew Bible as a town of Benjamin, founded along with Ono by Shamed or Shamer (1 Chronicles 8:12; Ezra 2:33; Nehemiah 7:37; 11:35). In Ezra 2:33, it is mentioned as one of the cities whose inhabitants returned after the Babylonian captivity. Lod is not mentioned among the towns allocated to the tribe of Benjamin in Joshua 18:11–28. The name Lod derives from a tri-consonantal root not extant in Northwest Semitic, but only in Arabic ("to quarrel; withhold, hinder"). An Arabic etymology of such an ancient name is unlikely (the earliest attestation is from the Achaemenid period). In the New Testament, the town appears in its Greek form, Lydda, as the site of Peter's healing of Aeneas in Acts 9:32–38. The city is also mentioned in an Islamic hadith as the location of the battlefield where the false messiah (al-Masih ad-Dajjal) will be slain before the Day of Judgment. History The first occupation dates to the Neolithic in the Near East and is associated with the Lodian culture. Occupation continued in the Levant Chalcolithic. Pottery finds have dated the initial settlement in the area now occupied by the town to 5600–5250 BCE. In the Early Bronze, it was an important settlement in the central coastal plain between the Judean Shephelah and the Mediterranean coast, along Nahal Ayalon. Other important nearby sites were Tel Dalit, Tel Bareqet, Khirbat Abu Hamid (Shoham North), Tel Afeq, Azor and Jaffa.
Two architectural phases belong to the late EB I in Area B. The first phase had a mudbrick wall, while the late phase included a circular stone structure. Later excavations have produced an occupation layer, Stratum IV. It consists of two phases: Stratum IVb, with a mudbrick wall on stone foundations and rounded exterior corners, and Stratum IVa, with a mudbrick wall with no stone foundations, with imported Egyptian pottery and local pottery imitations. Other excavations revealed nine occupation strata. Strata VI–III belonged to Early Bronze IB. The material culture showed Egyptian imports in strata V and IV. Occupation continued into Early Bronze II with four strata (V–II). There was continuity in the material culture and indications of centralized urban planning. North of the tell were scattered MB II burials. The earliest written record is in a list of Canaanite towns drawn up by the Egyptian pharaoh Thutmose III at Karnak in 1465 BCE. From the fifth century BCE until the Roman period, the city was a centre of Jewish scholarship and commerce. According to British historian Martin Gilbert, during the Hasmonean period, Jonathan Maccabee and his brother, Simon Maccabaeus, enlarged the area under Jewish control, which included conquering the city. The Jewish community in Lod during the Mishnah and Talmud era is described in a significant number of sources, including information on its institutions, demographics, and way of life. The city reached its height as a Jewish center between the First Jewish-Roman War and the Bar Kokhba revolt, and again in the days of Judah ha-Nasi and the start of the Amoraim period. The city was then the site of numerous public institutions, including schools, study houses, and synagogues. In 43 BCE, Cassius, the Roman governor of Syria, sold the inhabitants of Lod into slavery, but they were set free two years later by Mark Antony. During the First Jewish–Roman War, the Roman proconsul of Syria, Cestius Gallus, razed the town on his way to Jerusalem in Tishrei 66 CE. According to Josephus, "[he] found the city deserted, for the entire population had gone up to Jerusalem for the Feast of Tabernacles. He killed fifty people whom he found, burned the town and marched on". Lydda was occupied by Emperor Vespasian in 68 CE. In the period following the destruction of Jerusalem in 70 CE, Rabbi Tarfon, who appears in many Tannaitic and Jewish legal discussions, served as a rabbinic authority in Lod. During the Kitos War, 115–117 CE, the Roman army laid siege to Lod, where the rebel Jews had gathered under the leadership of Julian and Pappos. Torah study was outlawed by the Romans and pursued mostly underground. The distress became so great that the patriarch Rabban Gamaliel II, who was shut up there and died soon afterwards, permitted fasting on Ḥanukkah. Other rabbis disagreed with this ruling. Lydda was next taken and many of the Jews were executed; the "slain of Lydda" are often mentioned in words of reverential praise in the Talmud. In 200 CE, Emperor Septimius Severus elevated the town to the status of a city, calling it Colonia Lucia Septimia Severa Diospolis. The name Diospolis ("City of Zeus") may have been bestowed earlier, possibly by Hadrian. At that point, most of its inhabitants were Christian. The earliest known bishop is Aëtius, a friend of Arius. During the following century (200–300 CE), it is said that Joshua ben Levi founded a yeshiva in Lod. In December 415, the Council of Diospolis was held here to try Pelagius; he was acquitted.
In the sixth century, the city was renamed Georgiopolis after St. George, a soldier in the guard of the emperor Diocletian, who was born there between 256 and 285 CE. The Church of Saint George and Mosque of Al-Khadr is named for him. The 6th-century Madaba map shows Lydda as an unwalled city with a cluster of buildings under a black inscription reading "Lod, also Lydea, also Diospolis". An isolated large building with a semicircular colonnaded plaza in front of it might represent the St George shrine. After the Muslim conquest of Palestine by Amr ibn al-'As in 636 CE, Lod, which was referred to as "al-Ludd" in Arabic, served as the capital of Jund Filastin ("Military District of Palaestina") before the seat of power was moved to nearby Ramla during the reign of the Umayyad Caliph Suleiman ibn Abd al-Malik in 715–716. The population of al-Ludd was relocated to Ramla, as well. With the relocation of its inhabitants and the construction of the White Mosque in Ramla, al-Ludd lost its importance and fell into decay. The city was visited by the local Arab geographer al-Muqaddasi in 985, when it was under the Fatimid Caliphate, and was noted for its Great Mosque which served the residents of al-Ludd, Ramla, and the nearby villages. He also wrote of the city's "wonderful church (of St. George) at the gate of which Christ will slay the Antichrist." The Crusaders occupied the city in 1099 and named it St Jorge de Lidde. It was briefly conquered by Saladin, but retaken by the Crusaders in 1191. For the English Crusaders, it was a place of great significance as the birthplace of Saint George. The Crusaders made it the seat of a Latin Church diocese, and it remains a titular see. It owed the service of 10 knights and 20 sergeants, and it had its own burgess court during this era. In 1226, Ayyubid Syrian geographer Yaqut al-Hamawi visited al-Ludd and stated it was part of the Jerusalem District during Ayyubid rule. Sultan Baybars brought Lydda again under Muslim control by 1267–8. According to Qalqashandi, Lydda was an administrative centre of a wilaya during the fourteenth and fifteenth century in the Mamluk empire. Mujir al-Din described it as a pleasant village with an active Friday mosque. During this time, Lydda was a station on the postal route between Cairo and Damascus. In 1517, Lydda was incorporated into the Ottoman Empire as part of the Damascus Eyalet, and in the 1550s, the revenues of Lydda were designated for the new waqf of Hasseki Sultan Imaret in Jerusalem, established by Hasseki Hurrem Sultan (Roxelana), the wife of Suleiman the Magnificent. By 1596, Lydda was part of the nahiya ("subdistrict") of Ramla, which was under the administration of the liwa ("district") of Gaza. It had a population of 241 households and 14 bachelors who were all Muslims, and 233 households who were Christians. They paid a fixed tax rate of 33.3% on agricultural products, including wheat, barley, summer crops, vineyards, fruit trees, sesame, special products ("dawalib" = spinning wheels), goats and beehives, in addition to occasional revenues and market toll, a total of 45,000 Akçe. All of the revenue went to the Waqf. In 1051 AH (1641/2 CE), the Bedouin tribe of al-Sawālima from around Jaffa attacked the villages of Subṭāra, Bayt Dajan, al-Sāfiriya, Jindās, Lydda and Yāzūr belonging to Waqf Haseki Sultan. The village appeared as Lydda, though misplaced, on the map of Pierre Jacotin compiled in 1799. Missionary William M.
Thomson visited Lydda in the mid-19th century, describing it as a "flourishing village of some 2,000 inhabitants, imbosomed in noble orchards of olive, fig, pomegranate, mulberry, sycamore, and other trees, surrounded every way by a very fertile neighbourhood. The inhabitants are evidently industrious and thriving, and the whole country between this and Ramleh is fast being filled up with their flourishing orchards. Rarely have I beheld a rural scene more delightful than this presented in early harvest ... It must be seen, heard, and enjoyed to be appreciated." In 1869, the population of Ludd was given as: 55 Catholics, 1,940 "Greeks", 5 Protestants and 4,850 Muslims. In 1870, the Church of Saint George was rebuilt. In 1892, the first railway station in the entire region was established in the city. In the second half of the 19th century, Jewish merchants migrated to the city, but left after the 1921 Jaffa riots. In 1882, the Palestine Exploration Fund's Survey of Western Palestine described Lod as "A small town, standing among enclosure of prickly pear, and having fine olive groves around it, especially to the south. The minaret of the mosque is a very conspicuous object over the whole of the plain. The inhabitants are principally Moslim, though the place is the seat of a Greek bishop resident of Jerusalem. The Crusading church has lately been restored, and is used by the Greeks. Wells are found in the gardens...." From 1918, Lydda was under the administration of the British Mandate in Palestine, as per a League of Nations decree that followed the Great War. During the Second World War, the British set up supply posts in and around Lydda and its railway station, also building an airport that was renamed Ben Gurion Airport after the death of Israel's first prime minister in 1973. At the time of the 1922 census of Palestine, Lydda had a population of 8,103 inhabitants (7,166 Muslims, 926 Christians, and 11 Jews); the Christians were 921 Orthodox, 4 Roman Catholics and 1 Melkite. This had increased by the 1931 census to 11,250 (10,002 Muslims, 1,210 Christians, 28 Jews, and 10 Bahai), in a total of 2,475 residential houses. In 1938, Lydda had a population of 12,750. In 1945, Lydda had a population of 16,780 (14,910 Muslims, 1,840 Christians, 20 Jews and 10 "other"). Until 1948, Lydda was an Arab town with a population of around 20,000: 18,500 Muslims and 1,500 Christians. In 1947, the United Nations proposed dividing Mandatory Palestine into two states, one Jewish state and one Arab; Lydda was to form part of the proposed Arab state. In the ensuing war, Israel captured Arab towns outside the area the UN had allotted it, including Lydda. In December 1947, thirteen Jewish passengers in a seven-car convoy to Ben Shemen Youth Village were ambushed and murdered. In a separate incident, three Jewish youths, two men and a woman, were captured, then raped and murdered in a neighbouring village. Their bodies were paraded in Lydda's principal street. The Israel Defense Forces entered Lydda on 11 July 1948. The following day, under the impression that it was under attack, the 3rd Battalion was ordered to shoot anyone "seen on the streets". According to Israel, 250 Arabs were killed. Other estimates are higher: Arab historian Aref al Aref estimated 400, and Nimr al Khatib 1,700. In 1948, the population rose to 50,000 during the Nakba, as Arab refugees fleeing other areas made their way there.
A key event was the Palestinian expulsion from Lydda and Ramle, in which 50,000–70,000 Palestinians were expelled from the two towns by the Israel Defense Forces. All but 700 to 1,056 were expelled by order of the Israeli high command, and forced to walk 17 km (10½ mi) to the Jordanian Arab Legion lines. Estimates of those who died from exhaustion and dehydration vary from a handful to 355. The town was subsequently sacked by the Israeli army. Some scholars, including Ilan Pappé, characterize this as ethnic cleansing. The few hundred Arabs who remained in the city were soon outnumbered by the influx of Jews who immigrated to Lod from August 1948 onward, most of them from Arab countries. As a result, Lod became a predominantly Jewish town. After the establishment of the state, the biblical name Lod was readopted. The Jewish immigrants who settled Lod came in waves, first from Morocco and Tunisia, later from Ethiopia, and then from the former Soviet Union. Since 2008, many urban development projects have been undertaken to improve the image of the city. Upscale neighbourhoods have been built, among them Ganei Ya'ar and Ahisemah, expanding the city to the east. According to a 2010 report in the Economist, a three-meter-high wall was built between Jewish and Arab neighbourhoods, and construction in Jewish areas was given priority over construction in Arab neighborhoods. The newspaper says that violent crime in the Arab sector revolves mainly around family feuds over turf and honour crimes. In 2010, the Lod Community Foundation organised an event for representatives of bicultural youth movements, volunteer aid organisations, educational start-ups, businessmen, sports organizations, and conservationists working on programmes to better the city. In the 2021 Israel–Palestine crisis, a state of emergency was declared in Lod after Arab rioting led to the death of an Israeli Jew. The Mayor of Lod, Yair Revivio, urged Prime Minister of Israel Benjamin Netanyahu to deploy Israel Border Police to restore order in the city. This was the first time since 1966 that Israel had declared this kind of emergency lockdown. International media noted that both Jewish and Palestinian mobs were active in Lod, but the "crackdown came for one side" only. Demographics In the 19th century and until the Lydda Death March, Lod was an exclusively Muslim-Christian town, with an estimated 6,850 inhabitants, of whom approximately 2,000 (29%) were Christian. According to the Israel Central Bureau of Statistics (CBS), the population of Lod in 2010 was 69,500 people. According to the 2019 census, the population of Lod was 77,223, of which 53,581 people, comprising 69.4% of the city's population, were classified as "Jews and Others", and 23,642 people, comprising 30.6%, as "Arab". Education According to CBS, the city has 38 schools and 13,188 pupils, spread across 26 elementary schools with 8,325 elementary school pupils and 13 high schools with 4,863 high school pupils. About 52.5% of 12th-grade pupils were entitled to a matriculation certificate in 2001.[citation needed] Economy The airport and related industries are a major source of employment for the residents of Lod. Other important factories in the city are the communication equipment company "Talard", "Cafe-Co", a subsidiary of the Strauss Group, and "Kashev", the computer center of Bank Leumi. A Jewish Agency Absorption Centre is also located in Lod. According to CBS figures for 2000, 23,032 people were salaried workers and 1,405 were self-employed.
The mean monthly wage for a salaried worker was NIS 4,754, a real change of 2.9% over the course of 2000. Salaried men had a mean monthly wage of NIS 5,821 (a real change of 1.4%) versus NIS 3,547 for women (a real change of 4.6%). The mean income for the self-employed was NIS 4,991. About 1,275 people were receiving unemployment benefits and 7,145 were receiving an income supplement.

Art and culture In 2009–2010, Dor Guez held Georgeopolis, an exhibit focusing on Lod, at the Petach Tikva art museum.

Archaeology A well-preserved mosaic floor dating to the Roman period was excavated in 1996 as part of a salvage dig conducted on behalf of the Israel Antiquities Authority and the Municipality of Lod, prior to widening HeHalutz Street. According to Jacob Fisch, executive director of the Friends of the Israel Antiquities Authority, a worker at the construction site noticed the tail of a tiger and halted work. The mosaic was initially covered over with soil at the conclusion of the excavation for lack of funds to conserve and develop the site. The mosaic is now part of the Lod Mosaic Archaeological Center. The floor, with its colorful display of birds, fish, exotic animals and merchant ships, is believed to have been commissioned by a wealthy resident of the city for his private home. The Lod Community Archaeology Program, which operates in ten Lod schools, five Jewish and five Israeli Arab, combines archaeological studies with participation in digs in Lod.

Sports The city's major football club, Hapoel Bnei Lod, plays in Liga Leumit (the second division). Its home is at the Lod Municipal Stadium. The club was formed by a merger of Bnei Lod and Rakevet Lod in the 1980s. Two other clubs in the city play in the regional leagues: Hapoel MS Ortodoxim Lod in Liga Bet and Maccabi Lod in Liga Gimel. Hapoel Lod played in the top division during the 1960s and 1980s, and won the State Cup in 1984. The club folded in 2002. A new club, Hapoel Maxim Lod (named after former mayor Maxim Levy), was established soon after, but folded in 2007.
========================================
[SOURCE: https://en.wikipedia.org/wiki/CNet] | [TOKENS: 3106]
Contents CNET CNET (short for Computer Network) is an American media website that publishes reviews, news, articles, blogs, podcasts, and videos on technology and consumer electronics globally. CNET originally produced content for radio and television in addition to its website, before applying new media distribution methods through its internet television network, CNET Video, and its podcast and blog networks. Founded in 1992 by Halsey Minor and Shelby Bonnie, it was the flagship brand of CNET Networks and became a brand of CBS Interactive through that unit's acquisition of CNET Networks in 2008. Following its acquisition by Red Ventures on October 30, 2020, the website faced criticism for the decline in quality of its editorial content and its factual unreliability due to the use of generative AI in the creation of its articles, as well as concerns over its journalistic integrity after it began increased publication of biased reviews and sponsored content to benefit its advertising partners. On October 1, 2024, CNET was acquired by Ziff Davis. History After leaving PepsiCo, Halsey Minor and Shelby Bonnie launched c/net, a 24-hour cable network about computers and technology, in 1992. With help from Fox Network co-founder Kevin Wendle and former Disney creative associate Dan Baker, CNET produced four pilot television programs about computers, technology, and the Internet. CNET TV was composed of CNET Central, The Web, and The New Edge. CNET Central was created first and aired in syndication in the United States on the USA Network. Later, it began airing on USA's sister network Sci-Fi Channel along with The Web and The New Edge. These were later followed by TV.com in 1996. Media personality Ryan Seacrest first came to national prominence at CNET, as the host of The New Edge and doing various voice-over work for CNET. CNET online launched in June 1995. CNET, Inc., the site's owner, had its initial public offering (IPO) in July 1996, trading on the NASDAQ National Market as "CNWK". In 1998, CNET, Inc. was sued by Snap Technologies, operators of the education service CollegeEdge, for trademark infringement relating to CNET, Inc.'s ownership of the domain name Snap.com, due to Snap Technologies already owning a trademark on its name. CNET produced another television technology news program called News.com that aired on CNBC beginning in 1999. From 2001 to 2003, it operated CNET Radio on the Clear Channel-owned KNEW (910) in the San Francisco Bay Area, WBPS (890) in Boston, and XM Satellite Radio. CNET Radio offered technology-themed programming. After failing to attract a sufficient audience, CNET Radio ceased operating in January 2003 due to financial losses. In July 1999, CNET, Inc. acquired the Swiss-based company GDT, later renamed CNET Channel. In 1998, CNET, Inc. granted Asiacontent.com the right to set up CNET Asia, and the operation was brought back in December 2000. In January 2000, at the same time CNET, Inc. became CNET Networks, it acquired the comparison shopping site mySimon for $736 million. In October 2000, CNET Networks acquired ZDNET for approximately $1.6 billion. In January 2001, Ziff Davis reached an agreement with CNET Networks to regain the URLs lost in the 2000 sale of Ziff Davis to SoftBank, a publicly traded Japanese media and technology company. In April 2001, CNET acquired TechRepublic, which provides content for IT professionals from Gartner, for $23 million in cash and stock.
In May 2002, CNET Networks acquired Smartshop, an automated product catalog and feature comparison technology company, for an undisclosed amount. On July 14, 2004, CNET Networks announced that it would acquire the photography website Webshots for $70 million ($60 million in cash, $10 million in deferred consideration), completing the acquisition that same month. In October 2007, it sold Webshots to American Greetings for $45 million. In August 2005, CNET Networks acquired Metacritic, a review aggregation website, for an undisclosed amount. In 2005, Google representatives refused to be interviewed by all CNET reporters for a year after CNET published Google CEO Eric Schmidt's salary and named the neighborhood where he lives, as well as some of his hobbies and political donations. All the information had been gleaned from Google searches. In September 2006, CNET acquired Chowhound, an online food community. On October 10, 2006, Shelby Bonnie resigned as chairman and CEO, in addition to two other executives, as a result of a stock options backdating scandal that occurred between 1996 and 2003. This also caused the firm to restate its financial earnings over 1996 to 2003 for over $105 million in resulting expenses. The Securities and Exchange Commission later dropped an investigation into the practice. Neil Ashe was named as the new CEO. In December 2006, James Kim, an editor at CNET, died in the Oregon wilderness. CNET hosted a memorial show and podcasts dedicated to him. On March 1, 2007, CNET announced the public launch of BNET, a website targeted towards business managers. BNET had been running under beta status since 2005. In 2008, programmer Chris Wanstrath, who worked on GameSpot and Chowhound, left CNET to start GitHub. On May 15, 2008, it was announced that CBS Corporation would buy CNET Networks for US$1.8 billion. On June 30, 2008, the acquisition was completed. Former CNET Networks properties were managed under CBS Interactive at the time. CBS Interactive acquired many domain names originally created by CNET Networks, including download.com, downloads.com, upload.com, news.com, search.com, TV.com, mp3.com, chat.com, computers.com, shopper.com, com.com, and cnet.com. It also held radio.com until CBS Radio was sold to Entercom in 2017. In 2011, CNET and CBS Interactive were sued by a coalition of artists (led by FilmOn founder Alki David) for copyright infringement by promoting the download of LimeWire, popular peer-to-peer file-sharing software. Although the original suit was voluntarily dropped by Alki David, he vowed to sue at a later date to bring "expanded" action against CBS Interactive. In November 2011, another lawsuit against CBS Interactive was introduced, claiming that CNET and CBS Interactive knowingly distributed LimeWire. On September 19, 2013, CBS Interactive launched a Spanish-language sister site under the name CNET en Español. It focused on topics of relevance primarily to Spanish-speaking technology enthusiasts. The site offered a "new perspective" on technology and was under the leadership of managing editor Gabriel Sama. The site not only offered news and tutorials, but also had a robust reviews section that was led by Juan Garzon. After Red Ventures' acquisition, the company announced the closing of CNET en Español on November 11, 2020, leaving the US market without its largest Spanish-language tech site. In March 2014, CNET refreshed its site by merging with CNET UK and vowing to merge all of its international editions into a single unified outlet.
This merge brought many changes, foremost of which were a new user interface and the renaming of CNET TV as CNET Video. Red Ventures announced in September 2020 that it would acquire CNET from ViacomCBS for $500 million. The transaction was completed on October 30, 2020. In November 2022, CNET began publishing articles written with artificial intelligence and edited by humans. CNET was criticized for failing to disclose that it was using a machine to write articles, and for using human bylines on some AI-generated content until caught by independent investigators. CNET reviewed those articles in January 2023 after many were found to contain serious errors and plagiarized material. CNET reporters said Red Ventures pushed them to give more favourable coverage to advertisers and work on sponsored content. Subsequently, 10% of CNET staff were laid off. Employees unionized in response to the scandal and layoffs, saying AI-generated content posed a danger to their professional reputations. A former staffer demanded that her byline be removed from the site, in order to protect her reputation if her articles were revised by AI. In August 2023, CNET deleted thousands of old articles from its website in an effort to raise its search engine optimization rankings on Google Search. Before an article is deleted from the website, CNET creates an internal copy and submits another to the Wayback Machine. The writer, if still employed by CNET, is also alerted 10 days in advance. Google said deleting articles to optimize for search engine rankings is not a good practice. In January 2024, Axios reported that Red Ventures was exploring a sale of the website, with a goal of attaining at least $250 million for it. The site was profitable at the time. The approximate halving of CNET's value under Red Ventures' ownership is attributed to interest rates, a slower ad market, and the reputational damage to CNET caused by the AI scandals. On August 6, 2024, the New York Times reported that Red Ventures had reached an agreement to sell CNET to Ziff Davis for $100 million, subject to regulatory approval. The acquisition was completed in the third quarter of 2024.

Websites [Lists of CNET's France and Japan websites omitted.] CNET launched a website to cover video games, CNET Gamecenter, in the middle of 1996. According to the San Francisco Chronicle, it was "one of the first Web sites devoted to computer gaming news". It became a leading game-focused website; in 1999, PC Magazine named it one of the hundred-best websites in any field, alongside competitors IGN and GameSpot. According to Gamecenter head Michael Brown, the site received between 50,000 and 75,000 daily visitors by late 2000. In May 2000, CNET founded the Gamecenter Alliance network to bring Gamecenter and four partner websites, including Inside Mac Games, under one banner. Nielsen//NetRatings ranked Gamecenter the sixth-most-popular gaming website in the United States by mid-2000. On July 19, 2000, CNET, Inc. made public its plan to buy Ziff-Davis and its ZDNet Internet business for $1.6 billion. Because ZDNet had partnered with SpotMedia, parent company of GameSpot, in late 1996, the acquisition brought both GameSpot and Gamecenter under CNET, Inc.'s ownership. Later that year, The New York Times described the two publications as the "Time and Newsweek of gaming sites".
The paper reported that Gamecenter "seem[ed] to be thriving" amid the dot-com crash, with its revenue distributed across online advertising and an affiliate sales program with CNET's Game Shopper website, launched in late 1999. Following an almost $400 million loss at CNET as a result of the dot-com crash, the company ended the Gamecenter Alliance network in January 2001. On February 7, Gamecenter itself was closed in a redundancy reduction effort, as GameSpot was the more successful of the two sites. Around 190 jobs were cut from CNET during this period, including "at least 20" at Gamecenter, according to the San Francisco Chronicle. Discussing the situation, Tom Bramwell of Eurogamer reported, "It is thought [...] that very few if any of the website's staff will move sideways into jobs at GameSpot, now the company's other gaming asset." The Washington Post later noted that Gamecenter was among the "popular video-game news sites" to close in 2001, alongside Daily Radar.

Criticism In January 2013, CNET named Dish Network's "Hopper with Sling" digital video recorder as a nominee for the CES "Best in Show" award (which is decided by CNET on behalf of its organizers), and named it the winner in a vote by the site's staff. However, CBS abruptly disqualified the Hopper and vetoed the results because the company was in active litigation with Dish Network. CNET also announced that it could no longer review any product or service provided by companies with which CBS was in litigation (which also included Aereo). The new vote subsequently gave the Best in Show award to the Razer Edge tablet instead. Dish Network's CEO Joe Clayton said that the company was "saddened that CNET's staff is being denied its editorial independence because of CBS' heavy-handed tactics." On January 14, 2013, editor-in-chief Lindsey Turrentine addressed the situation, stating that CNET's staff were in an "impossible" position due to the conflict of interest involved, and promised that she would do everything within her power to prevent a similar incident from occurring again. The conflict also prompted one CNET senior writer, Greg Sandoval, to resign. The decision also drew the ire of staff from the Consumer Electronics Association, the organizers of CES; CEO Gary J. Shapiro criticized the decision in a USA Today op-ed column and a statement by the CEA, stating that "making television easier to watch is not against the law. It is simply pro-innovation and pro-consumer." Shapiro felt that the decision also hurt the confidence of CNET's readers and staff, "destroying its reputation for editorial integrity in an attempt to eliminate a new market competitor." As a result of the controversy, and fearing damage to the show's brand, the CEA announced on January 31, 2013, that CNET would no longer decide the CES Best in Show award winner due to the interference of CBS (the position was offered to other technology publications), and the "Best in Show" award was jointly awarded to both the Hopper with Sling and the Razer Edge.

With a catalog of more than 400,000 titles, the Downloads section of the website allows users to download popular software. CNET's download.com provides Windows, Macintosh, and mobile software for download. CNET claims that this software is free of spyware, but independent sources have confirmed that this is not the case. While Download.com is overall a safe place to download programs, precautions should be taken before downloading from the site, as some downloads do contain malware.
In January 2023, Wikipedia editors began the process of downgrading CNET's reliability rating as a source following the revelation that CNET was publishing content generated by artificial intelligence. In response to the decision, CNET claimed it maintained high editorial standards, stating, "It is important to clarify that CNET is not actively using AI to create new content. While we have no specific plans to restart, any future initiatives would follow our public AI policy."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-:6-104] | [TOKENS: 11899]
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level, the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, large polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid water on the surface. Its surface gravity is roughly a third of Earth's, or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is roughly equal to that of all the dry land of Earth. Fine dust is prevalent across the surface and in the atmosphere; in the low Martian gravity, even the weak winds of the tenuous atmosphere can pick it up and spread it. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy: the northern hemisphere consists mainly of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling beneath the surface; it also hosts many enormous extinct volcanoes (the tallest, Olympus Mons, is 21.9 km or 13.6 mi tall) and one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) is equal to 24.6 hours; a short conversion example follows this overview. Mars formed along with the other planets approximately 4.5 billion years ago. During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans, and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, dominates the geological processes still at work on the planet. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight past Mars took place with Mars 1, though communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and the first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. 
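As a quick consistency check on the calendar figures above, a short Python sketch converts the 687-Earth-day Martian year into sols, using the precise sol length of 24 h 39 m 35.244 s quoted later in this article. The constants are the article's rounded values, not ephemeris data.

```python
# Rough conversion between Earth days and Martian sols, using the
# figures quoted in this article (not precise ephemeris values).

EARTH_DAY_HOURS = 24.0
SOL_HOURS = 24.0 + 39.0 / 60.0 + 35.244 / 3600.0  # one sol: 24 h 39 m 35.244 s
MARS_YEAR_EARTH_DAYS = 687.0                      # Martian orbital period

def earth_days_to_sols(days: float) -> float:
    """Convert a span of Earth days into Martian sols."""
    return days * EARTH_DAY_HOURS / SOL_HOURS

sols_per_mars_year = earth_days_to_sols(MARS_YEAR_EARTH_DAYS)
print(f"One Martian year is about {sols_per_mars_year:.0f} sols")  # ~669
```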
Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of runaway accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system, between 3.5 and 4 billion years ago. This ring system may have formed from a moon 20 times more massive than Phobos that orbited Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary ones are the Noachian, the Hesperian, and the Amazonian. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth, or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity; a short worked check of this figure follows below. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. 
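The 38% surface-gravity figure quoted in this section follows directly from Newton's g = GM/R², given the mass and volume ratios in the text. Below is a minimal Python sketch of the arithmetic; the 0.11 and 0.15 ratios are the rounded values quoted above, so the result is approximate.

```python
# Surface gravity of Mars relative to Earth, from g = G*M/R**2.
# The mass and volume ratios are rounded values quoted in the text.

MASS_RATIO = 0.11    # Mars mass / Earth mass
VOLUME_RATIO = 0.15  # Mars volume / Earth volume

# Volume scales as R**3, so the radius ratio is the cube root of the volume ratio.
radius_ratio = VOLUME_RATIO ** (1.0 / 3.0)

# g = G*M/R**2, so the gravity ratio is (mass ratio) / (radius ratio)**2.
gravity_ratio = MASS_RATIO / radius_ratio**2

print(f"radius ratio:  {radius_ratio:.2f}")   # ~0.53, matching "about half Earth's diameter"
print(f"gravity ratio: {gravity_ratio:.2f}")  # ~0.39, matching "about 38% of Earth's gravity"
```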
The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to increase again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed through these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris, preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 ± 67 kilometres (381 ± 42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or to silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low-albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a slightly basic pH of 7.7, and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars, and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and lighten with age. They can start in a tiny area and then spread out for hundreds of metres, and have been seen to follow the edges of boulders and other obstacles in their path. 
The most commonly accepted hypothesis is that the streaks are dark underlying layers of soil revealed after avalanches of bright dust or the passage of dust devils. Several other explanations have been put forward, including some that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day, or 22 millirads per day, experienced during the flight to and from Mars. For comparison, radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation, at about 0.342 millisieverts per day; lava tubes southwest of Hadriacus Mons may have levels as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars shows no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists, writers, and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum; the southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its prime meridian was specified, as was Earth's (at Greenwich), by the choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. 
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs to define 0.0° longitude, coinciding with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined as the height at which the atmospheric pressure is 610.5 Pa (6.105 mbar). This pressure corresponds to the triple point of water, and is about 0.6% of the sea-level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, including the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, assigning it a definite height is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by about 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, around 1,800 kilometres (1,100 mi) in diameter, and Isidis, around 1,500 kilometres (930 mi) in diameter. Owing to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that for Earth. However, Mars is located closer to the asteroid belt, which increases its chance of being struck by materials from that source, and it is also more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps) has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe, and the canyon extends across one-fifth of the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed by the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly making Mars a planet with a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide, and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and the high-energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Dust of a given size settles out of the thin Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars remained in the Martian atmosphere for only 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. 
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen, along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin; one suggestion is that methane is present and that its concentration fluctuates seasonally. The methane could be produced by a non-biological process such as serpentinization – involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars – or by Martian life. Mars's higher concentration of atmospheric CO2 and lower surface pressure may explain why sound is attenuated more there than on Earth; natural sound sources are rare on Mars apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions, as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars had temporarily doubled, associated with an aurora 25 times brighter than any observed earlier, due to a massive and unexpected solar storm in the middle of that month. Mars has seasons, alternating between its northern and southern hemispheres, similar to those on Earth. Additionally, the orbit of Mars has a large eccentricity compared to Earth's, and Mars reaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. Summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% of Earth's), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight; a short numerical check of the sunlight and pressure figures follows below. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase the global temperature. Seasonally, a covering of CO2 dry ice also forms on the polar ice caps. 
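Two figures in the passage above lend themselves to a back-of-the-envelope check: the 43% sunlight figure follows from the inverse-square law at 1.52 times Earth's distance, and the quoted 600 Pa mean pressure combined with the 10.8 km scale height gives rough pressures at extreme elevations via the barometric formula p(h) = p0·exp(−h/H). The sketch below assumes a single isothermal scale height, which is only a crude model of the real atmosphere; it overshoots the measured ~30 Pa on Olympus Mons but lands close to the Hellas figure.

```python
import math

# Back-of-the-envelope checks on figures quoted in this section.
# All constants are rounded values from the article; the barometric
# formula assumes one isothermal scale height, a crude approximation.

MARS_SUN_DISTANCE_AU = 1.52  # Mars's orbital distance relative to Earth's
MEAN_PRESSURE_PA = 600.0     # mean surface pressure at the zero-elevation datum
SCALE_HEIGHT_KM = 10.8       # atmospheric scale height

# Inverse-square law: relative solar flux at Mars.
flux_fraction = 1.0 / MARS_SUN_DISTANCE_AU**2
print(f"solar flux vs Earth: {flux_fraction:.0%}")  # ~43%

def pressure_at(elevation_km: float) -> float:
    """Barometric estimate p = p0 * exp(-h/H), h measured from the datum."""
    return MEAN_PRESSURE_PA * math.exp(-elevation_km / SCALE_HEIGHT_KM)

# Olympus Mons summit (~21.9 km above datum) and the Hellas floor (~7 km below).
print(f"Olympus Mons: ~{pressure_at(21.9):.0f} Pa")  # ~79 Pa; measured value is nearer 30 Pa
print(f"Hellas floor: ~{pressure_at(-7.0):.0f} Pa")  # ~1147 Pa, close to the quoted 1,155 Pa
```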
Hydrology Mars contains water in significant amounts, but most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the planet's surface to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% of Earth's. Only at the lowest elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with snow of carbon dioxide dry ice. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much longer than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along crater and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and to face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies formed by weathering have been observed, nor any superimposed impact craters, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite, which forms only in the presence of acidic water, showing that water once existed on Mars. 
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011 the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth, at 50–300 parts per million of water, which would be enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples, including the broken fragments of "Tintina" rock and "Sutton Inlier" rock, as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that it had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that the dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect that much of the low northern plains of the planet were covered with an ocean hundreds of metres deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of deuterium to protium (D/H) in the modern Martian atmosphere compared to that ratio on Earth. The Martian D/H ratio (9.3 ± 1.7 × 10−4) is five to seven times the ratio on Earth (1.56 × 10−4), suggesting that ancient Mars had significantly higher levels of water; a short check of this enrichment factor follows below. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4-kilometre-wide (50.6 mi) Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region; the volume of water detected has been estimated to be equivalent to the volume of Lake Superior (about 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. 
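As a sanity check on the deuterium figures above, dividing the Martian D/H ratio by the terrestrial one reproduces the quoted five-to-sevenfold enrichment. A minimal sketch, propagating the stated ±1.7 uncertainty naively:

```python
# Deuterium enrichment of Mars relative to Earth, using the D/H
# ratios quoted above; the range comes from the stated uncertainty.

MARS_DH = 9.3e-4      # Martian atmospheric D/H
MARS_DH_ERR = 1.7e-4  # quoted uncertainty on the Martian value
EARTH_DH = 1.56e-4    # terrestrial D/H

best = MARS_DH / EARTH_DH
low = (MARS_DH - MARS_DH_ERR) / EARTH_DH
high = (MARS_DH + MARS_DH_ERR) / EARTH_DH

print(f"enrichment: {best:.1f}x (range {low:.1f}-{high:.1f}x)")
# -> enrichment: 6.0x (range 4.9-7.1x), consistent with "five to seven times"
```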
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference between Mars and Earth, and thus the delta-v needed to transfer between them, is the second lowest of any planet from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. Mars is known to have had a much more circular orbit in the past. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years, compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth around opposition, recurring with a synodic period of 779.94 days. Opposition should not be confused with solar conjunction, when Earth and Mars are on opposite sides of the Sun, forming a straight line that passes through it. The average time between successive oppositions of Mars, its synodic period, is 780 days, but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest, Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71, with a standard deviation of 1.05. Because the orbit of Mars is eccentric, its magnitude at opposition can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86, when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest, because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, meaning it appears to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit the planet at distances of 9,376 km (5,826 mi) and 23,460 km (14,580 mi), respectively; their orbital periods follow from Kepler's third law, as sketched below. 
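From the orbital radii just quoted, Kepler's third law, T = 2π√(a³/GM), recovers the moons' orbital periods, and combining Phobos's period with the planet's rotation explains why it appears to rise again roughly every 11 hours, as noted in the next passage. The sketch assumes two constants not given in this article: Mars's standard gravitational parameter GM ≈ 4.2828 × 10¹³ m³/s² and a sidereal rotation period of about 24.62 hours.

```python
import math

# Orbital periods of Phobos and Deimos from Kepler's third law,
# T = 2*pi*sqrt(a**3 / GM). Orbital radii are the values quoted above;
# GM and the rotation period are assumed standard values.

GM_MARS = 4.2828e13     # m^3/s^2, standard gravitational parameter (assumed)
SIDEREAL_DAY_H = 24.62  # Mars's rotation period in hours (assumed)

def orbital_period_hours(radius_km: float) -> float:
    a = radius_km * 1e3  # semi-major axis in metres
    return 2.0 * math.pi * math.sqrt(a**3 / GM_MARS) / 3600.0

phobos = orbital_period_hours(9376.0)   # ~7.7 h
deimos = orbital_period_hours(23460.0)  # ~30.3 h

# Apparent interval between successive risings, seen from the rotating surface:
# 1/T_apparent = |1/T_orbit - 1/T_rotation|.
phobos_apparent = 1.0 / abs(1.0 / phobos - 1.0 / SIDEREAL_DAY_H)

print(f"Phobos orbital period: {phobos:.1f} h")
print(f"Deimos orbital period: {deimos:.1f} h")
print(f"Phobos rises again every ~{phobos_apparent:.0f} h")  # ~11 h, as stated below
```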
The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman counterpart of Ares; in modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below the synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both moons have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. Mars may have yet-undiscovered moons smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin is the involvement of a third body or some kind of impact disruption. More recent lines of evidence – that Phobos has a highly porous interior, and a suggested composition containing mainly phyllosilicates and other minerals known from Mars – point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past; by analyzing rocks that record tidal processes on the planet, the study suggests that such tides may have been regulated by this past moon. Human observations and exploration The history of observations of Mars is marked by the oppositions of Mars, when the planet is closest to Earth and hence most easily visible; these occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
During Sumerian times, Nergal was a minor deity of little significance, but in later times his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers, and by 1534 BCE they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "fiery"); more commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known to Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星), based on the Wuxing system. In 1609, Johannes Kepler published the results of a roughly ten-year study of the orbit of Mars, using the diurnal parallax of Mars measured by Tycho Brahe to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first telescopic observation of Mars. The telescope was later used to measure the diurnal parallax of Mars in an effort to determine the Sun–Earth distance; this was first done by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. 
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory equipped with 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894 and the following, less favorable, oppositions. Lowell published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, such as Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers), in combination with the canals, led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, though contact had been lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were radically overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between the shutdown of Viking 1 in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without making contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon; together with Mars Global Surveyor (operated until late 2006), it started an uninterrupted active robotic presence at Mars that has lasted until today. Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft – orbiters, landers, and rovers – have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the Martian hydrosphere and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit, including 2001 Mars Odyssey, Mars Express, the Mars Reconnaissance Orbiter, MAVEN, the ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. 
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Further missions to Mars are planned. As of February 2024, debris from Mars missions amounts to over seven tons, most of it consisting of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars; key areas include establishing telecommunications, payload delivery, and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were still published on Martian biology, setting aside explanations other than life for the seasonal changes observed on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region (a short numerical check of this follows below), but the planet's thin, low-pressure atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, poor shielding against bombardment by the solar wind due to the absence of a magnetosphere, and insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and the interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites, and they had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. 
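The claim above that Mars dips inside the habitable zone near perihelion can be made concrete with the rounded orbital elements from the orbital-motion section (semi-major axis about 1.52 times Earth's distance, eccentricity about 0.09); perihelion and aphelion follow as a(1−e) and a(1+e). A minimal sketch:

```python
# Perihelion and aphelion of Mars from the rounded orbital elements
# quoted in this article: a ~ 1.52 AU, e ~ 0.09.

AU_KM = 149.6e6  # kilometres per astronomical unit
A_MARS = 1.52    # semi-major axis in AU (rounded value from the text)
E_MARS = 0.09    # orbital eccentricity (rounded value from the text)

perihelion = A_MARS * (1.0 - E_MARS)
aphelion = A_MARS * (1.0 + E_MARS)

print(f"perihelion: {perihelion:.2f} AU ({perihelion * AU_KM / 1e6:.0f} million km)")  # ~1.38 AU
print(f"aphelion:   {aphelion:.2f} AU ({aphelion * AU_KM / 1e6:.0f} million km)")      # ~1.66 AU
```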
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters have both been claimed as possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by the impact of meteors, can preserve signs of life on Earth; likewise, the glass found in impact craters on Mars could have preserved signs of life, if life existed at the site. The Cheyava Falls rock, discovered on Mars in June 2024, has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. Although highly intriguing, the find allows no definitive determination of a biological or abiotic origin with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. As of 2021, China was planning to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisioned the beginning of a Mars colony within the next twenty years, enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth and by in-situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demigod Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. The association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. 
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links Solar System → Local Interstellar Cloud → Local Bubble → Gould Belt → Orion Arm → Milky Way → Milky Way subgroup → Local Group → Local Sheet → Local Volume → Virgo Supercluster → Laniakea Supercluster → Pisces–Cetus Supercluster Complex → Local Hole → Observable universe → UniverseEach arrow (→) may be read as "within" or "part of".
========================================
[SOURCE: https://en.wikipedia.org/wiki/Gravitational_field] | [TOKENS: 1665]
Gravitational field In physics, a gravitational field or gravitational acceleration field is a vector field used to explain the influences that a body extends into the space around itself. A gravitational field is used to explain gravitational phenomena, such as the gravitational force exerted on another massive body. It has the dimension of acceleration (L/T²) and is measured in units of newtons per kilogram (N/kg) or, equivalently, in metres per second squared (m/s²). In its original concept, gravity was a force between point masses. Following Isaac Newton, Pierre-Simon Laplace attempted to model gravity as some kind of radiation field or fluid, and since the 19th century, explanations for gravity in classical mechanics have usually been taught in terms of a field model rather than a point attraction. The field results from the spatial gradient of the gravitational potential. In general relativity, rather than two particles attracting each other, the particles distort spacetime via their mass, and this distortion is what is perceived and measured as a "force". In such a model one states that matter moves in certain ways in response to the curvature of spacetime, and that there is either no gravitational force, or that gravity is a fictitious force. Gravity is distinguished from other forces by its obedience to the equivalence principle. Classical mechanics In classical mechanics, a gravitational field is a physical quantity. A gravitational field can be defined using Newton's law of universal gravitation. Determined in this way, the gravitational field g around a single particle of mass M is a vector field consisting, at every point, of a vector pointing directly towards the particle. The magnitude of the field at every point is calculated by applying the universal law, and represents the force per unit mass on any object at that point in space. Because the force field is conservative, there is a scalar potential energy per unit mass, Φ, at each point in space associated with the force field; this is called the gravitational potential. The gravitational field equation is

$$\mathbf{g} = \frac{\mathbf{F}}{m} = \frac{d^2\mathbf{R}}{dt^2} = -GM\,\frac{\mathbf{R}}{|\mathbf{R}|^3} = -\nabla\Phi,$$

where F is the gravitational force, m is the mass of the test particle, R is the radial vector of the test particle relative to the mass (or, for the time-dependent form of Newton's second law of motion, a set of positions of test particles, each occupying a particular point in space at the start of testing), t is time, G is the gravitational constant, and ∇ is the del operator. This includes Newton's law of universal gravitation and the relation between the gravitational potential and the field acceleration. Note that d²R/dt² and F/m are both equal to the gravitational acceleration g (equivalent to the inertial acceleration, so of the same mathematical form, but also defined as gravitational force per unit mass). The negative signs are inserted because the force acts antiparallel to the displacement. The equivalent field equation in terms of the mass density ρ of the attracting mass is

$$\nabla\cdot\mathbf{g} = -\nabla^2\Phi = -4\pi G\rho,$$

which contains Gauss's law for gravity and Poisson's equation for gravity. Newton's law implies Gauss's law, but not vice versa; see Relation between Gauss's and Newton's laws. 
These classical equations are differential equations of motion for a test particle in the presence of a gravitational field, i.e. setting up and solving these equations allows the motion of a test mass to be determined and described.

The field around multiple particles is simply the vector sum of the fields around each individual particle. A test particle in such a field will experience a force that equals the vector sum of the forces it would experience in these individual fields:

\mathbf{g} = \sum_{i}\mathbf{g}_{i} = \frac{1}{m}\sum_{i}\mathbf{F}_{i} = -G\sum_{i} m_{i}\,\frac{\mathbf{R}-\mathbf{R}_{i}}{\left|\mathbf{R}-\mathbf{R}_{i}\right|^{3}} = -\sum_{i}\nabla\Phi_{i} ,

i.e. the gravitational field on mass m_j is the sum of all gravitational fields due to all other masses m_i, except the mass m_j itself. R_i is the position vector of the gravitating particle i, and R is that of the test particle.

General relativity
A freely moving particle in a gravitational field has the equations of motion

\frac{d^{2}x^{\lambda}}{d\tau^{2}} + \Gamma^{\lambda}_{\mu\nu}\,\frac{dx^{\mu}}{d\tau}\frac{dx^{\nu}}{d\tau} = 0 ,

where τ is the proper time for the particle, the Γ^λ_{μν} are the Christoffel symbols, and repeated indices are summed over. The proper time can be expressed in terms of the metric tensor:

d\tau^{2} = -g_{\mu\nu}\,dx^{\mu}dx^{\nu} .

The field that determines the gravitational force consists of the Christoffel symbols and their derivatives; the metric tensor plays the role of the gravitational potential. In general relativity, the gravitational field is determined by solving the Einstein field equations

\mathbf{G} = \kappa\mathbf{T} ,

where T is the stress–energy tensor, G is the Einstein tensor, and κ is the Einstein gravitational constant. The latter is defined as κ = 8πG/c⁴, where G is the Newtonian constant of gravitation and c is the speed of light. These equations depend on the distribution of matter, stress and momentum in a region of space, unlike Newtonian gravity, which depends only on the distribution of matter.

The fields themselves in general relativity represent the curvature of spacetime. General relativity states that being in a region of curved space is equivalent to accelerating up the gradient of the field. By Newton's second law, this will cause an object to experience a fictitious force if it is held still with respect to the field. This is why a person standing still on the Earth's surface feels pulled down by the force of gravity. In general the gravitational fields predicted by general relativity differ in their effects only slightly from those predicted by classical mechanics, but there are a number of easily verifiable differences, one of the most well known being the deflection of light in such fields (a worked numerical comparison is sketched at the end of this article).

Embedding diagram
Embedding diagrams are three-dimensional graphs commonly used for educational purposes to illustrate gravitational potential, drawing the potential field as a gravitational topography and depicting the potentials as so-called gravitational wells.
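To put a number on that light-deflection difference, here is a small illustrative sketch (again an addition, not from the article). It compares the standard first-order general-relativistic deflection angle for a light ray grazing the Sun, 4GM/(c²b), with the Newtonian corpuscular estimate 2GM/(c²b), which is exactly half; the solar mass and radius are approximate textbook values, and the formulas are the usual weak-field results rather than anything derived in this article.

```python
import math

# Approximate SI values, chosen for illustration.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.963e8    # solar radius, m (impact parameter of a grazing ray)

def deflection_gr(M, b):
    """First-order general-relativistic deflection angle 4GM/(c^2 b), in radians."""
    return 4 * G * M / (c**2 * b)

def deflection_newtonian(M, b):
    """Newtonian corpuscular estimate 2GM/(c^2 b): exactly half the GR value."""
    return 2 * G * M / (c**2 * b)

RAD_TO_ARCSEC = 180 / math.pi * 3600

print(f"GR:        {deflection_gr(M_SUN, R_SUN) * RAD_TO_ARCSEC:.2f} arcsec")        # ~1.75
print(f"Newtonian: {deflection_newtonian(M_SUN, R_SUN) * RAD_TO_ARCSEC:.2f} arcsec") # ~0.88
```

The roughly 1.75-arcsecond relativistic value is the figure famously tested during the 1919 solar eclipse observations, and the factor-of-two gap between the two predictions is what makes light deflection one of the "easily verifiable differences" mentioned above.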
========================================
[SOURCE: https://en.wikipedia.org/wiki/Yellow_paint_debate] | [TOKENS: 600]
Contents Yellow paint debate
In video game design, the yellow paint debate (also called the yellow paint discourse) is the ongoing discussion about yellow paint as an environmental directional marker in level design. Yellow paint has the benefit of being highly visible, but the downside that players may find it immersion-breaking or patronizing. Its use has at times drawn controversy.

Description
Bright colors are attention-grabbing, something encountered in nature through warning coloration and in everyday human life in the colors used for warning indicators, such as safety yellow. In video games, the game environment represents a simulacrum of some real space, in which player movement and environmental interaction are usually limited compared to real life. As players may miss realistic cues on where a path is, or which items will respond to interaction attempts, game developers may choose to use brightly colored indicators to draw the player's attention. The amount of detail in video game levels has gone up as the underlying technology has improved, and this makes some historical solutions, such as unique art assets for climbing points, less obvious than they were in older games. If the path forward is sufficiently unintuitive, a subjective problem discovered either by playtesters or by the community that plays the game after its release, it can leave players stuck, in some cases causing them to give up on the game entirely. The use of in-world markers guiding the player's direction, typified by (but not limited to) yellow paint, represents an attempt by game developers to provide guidance without breaking the game's sense of immersion. The debate around yellow paint is over the extent to which this practice is necessary and effective.

History
IGN's Vikki Blake cites Uncharted as an early example of yellow paint guide markers in games. The Stanley Parable, a satirical game from 2013, features an extreme form of yellow paint guidance in which the player is instructed by the game's narrator to follow a yellow stripe on the floor. More recent games involved in the debate include the Resident Evil 4 remake and Final Fantasy VII Rebirth.

Reception
Games journalist Celia Wagar described yellow paint as "a cheap and easy way of" indicating interaction points, "but it probably represents an earlier failure in the art design of the game". PlaytestCloud wrote that "It's hard to say whether or not the use of yellow paint is the right solution, as we'd argue it all depends on the type of experience the developers are trying to create... Nevertheless, it's hard to disagree that the yellow paint can sometimes be intrusive, leading us to believe that game developers could devise an alternative." In a 2024 article, Kotaku summed up the yellow paint debate as unsolvable: "Get rid of it and some games become unplayable for folks. Keep it and people will make fun of it and complain. Add a toggle and then you have to build your levels and art in a way that can guide players without yellow paint for all the folks who turn it off."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Aos_S%C3%AD] | [TOKENS: 1700]
Contents Aos Sí
The aos sí (pronounced [iːsˠ ˈʃiː]; English approximation: /iːs ˈʃiː/ eess SHEE; older form: aes sídhe, áes sídhe [eːsˠ ˈʃiːə]) is a supernatural race in Irish folklore, similar to elves. They are said to descend from the Tuatha Dé Danann or the gods of Irish mythology. The name aos sí means "folk of the sí"; these are the burial mounds in which they are said to dwell, which are seen as portals to an Otherworld. Such abodes are referred to in English as 'shee', 'fairy mounds', 'elf mounds' or 'hollow hills'. The aos sí interact with humans and the human world. They are variously said to be the ancestors, the spirits of nature, or goddesses and gods. In modern Irish, they are also called daoine sidhe (daoine sídhe) or daoine sí; in Scottish Gaelic daoine sìth ('folk of the fairy mounds').

Etymology
In the Irish language, aos sí, earlier aes sídhe, means "folk of the fairy mounds". In Old Irish, it was áes síde. The word sí or sídh in Irish means a fairy mound or ancient burial mound; such mounds were seen as portals to an Otherworld. It is derived from Proto-Celtic *sīdos ('abode'), and is related to the English words 'seat' and 'settle'. David Fitzgerald conjectured that the word sídh was synonymous with "immortal", comparing it with words such as sídsat ("they wait/remain"), síthbeo ("lasting"), sídhbuan ("perpetual"), and sídhbe ("long life"). In most of the tales concerning the sí, a great age or long life is implied.

In medieval literature
In medieval Irish literature, the names aes síde and fír síde (folk of the síd) are equivalent to the terms Tuath Dé and Tuatha Dé Danann. The only difference is that Tuath Dé tends to be used in contexts of legendary history and mythology. Writing in the 7th century, the Irish bishop Tírechán described the sídh folk as "earthly gods" (Latin: dei terreni). The 8th-century Fiacc's Hymn says that the Irish adored the sídh folk before the coming of Saint Patrick.

In Irish folklore
Due to the oral nature of Irish folklore, the exact origins of the fairies are not well defined, but there are enough stories to support two possible origins: the fairies could be either fallen angels or the descendants of the Tuatha Dé Danann; in the latter case, this is equivalent to the aos sí. In the former case, it is said that the fairies are angels who have fallen from heaven, but whose sins were not great enough to warrant hell. In many Gaelic tales, the aos sí are later literary versions of the Tuatha Dé Danann ("People of the Goddess Danu"), the deities and deified ancestors of Irish mythology. Some sources describe them as the survivors of the Tuatha Dé Danann who retreated into the Otherworld when fleeing the mortal Sons of Míl Espáine who, like many other early invaders of Ireland, came from Iberia. As part of the terms of their surrender to the Milesians, the Tuatha Dé Danann agreed to retreat and dwell underground. In folk belief and practice, the aos sí are often appeased with offerings, and care is taken to avoid angering or insulting them. Often they are not named directly, but rather spoken of as "The Good Neighbours", "The Fair Folk", or simply "The Folk". The most common names for them, aos sí, aes sídhe, daoine sídhe (singular duine sídhe) and daoine sìth, mean, literally, "people of the mounds" (referring to the sídhe). The aos sí are generally described as stunningly beautiful, though they can also be terrible and hideous.
Aos sí are seen as fierce guardians of their abodes, whether a fairy hill, a fairy ring, a special tree (often a whitethorn) or a particular loch or wood. It is believed that infringing on these spaces will cause the aos sí to retaliate in an effort to remove the people or objects that invaded their homes. Many of these tales contribute to the changeling myth in west European folklore, with the aos sí kidnapping trespassers or replacing their children with changelings as a punishment for transgressing. The aos sí are often connected to certain times of year and hours; as the Gaelic Otherworld is believed to come closer to the mortal world at dusk and dawn, the aos sí correspondingly become easier to encounter. Some festivals such as Samhain, Bealtaine and Midsummer are also associated with the aos sí.

A sídh (anglicized 'shee') is a burial mound (tumulus) associated with the aos sí. In modern Irish, the word is sí (plural síthe); in Scottish Gaelic it is sìth (plural sìthean); in Old Irish it is síd (plural síde). These sídhe are referred to in English as 'fairy mounds', 'elf mounds' or 'hollow hills'. In some later English-language texts, the word sídhe is incorrectly used both for the mounds and for the people of the mounds. For example, W. B. Yeats, writing in 1908, referred to the aos sí simply as "the sídhe". However, sidh in older texts refers specifically to "the palaces, courts, halls or residences" of the otherworldly beings that supposedly inhabit them. The aos sí are known by many names in Ireland.

Types
The banshee or bean sídhe (from Old Irish ban síde), which means "woman of the sídhe", has come to indicate any supernatural woman of Ireland who announces a coming death by wailing and keening. Her counterpart in Scottish mythology is the bean sìth (sometimes spelled bean-sìdh). Other varieties of aos sí and daoine sìth include the Scottish bean-nighe (the washerwoman who is seen washing the bloody clothing or armour of the person who is doomed to die), the leanan sídhe (the "fairy lover"), the cat-sìth (a fairy cat), and the cù-sìth (a fairy dog). The sluagh sídhe, "the fairy host", is sometimes depicted in Irish and Scottish lore as a crowd of airborne spirits, perhaps the cursed, evil or restless dead. The siabhra (anglicised as "sheevra") may be a type of these lesser spirits, prone to evil and mischief. However, an Ulster folk song also uses "sheevra" simply to mean "spirit" or "fairy".

Creideamh Sí
Creideamh Sí is Irish for the "Fairy Faith", a term for the collection of beliefs and practices observed by those who wish to keep good relationships with the aos sí and avoid angering them. General belief in the Celtic Otherworld, the existence of the aos sí, and the ability of the aos sí to influence the local area and its people are all beliefs characteristic of the Creideamh Sí. It is characterised as an aspect of Irish popular religion and exists syncretically with folk Christianity. Those who believe make an effort to appease the local aos sí with food and drink. The custom of offering milk and traditional foods, such as baked goods, apples or berries, to the aos sí has survived through the Christian era into the present day in parts of Ireland, Scotland and the diaspora. Those who maintain some degree of belief in the aos sí are also careful to leave their sacred places alone and to protect them from damage through road or housing construction.
========================================