[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-248] | [TOKENS: 10515]
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, Tesla shareholders approved a pay package for Musk worth up to $1 trillion, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. 
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared his dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. 
He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape; he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, however, Musk said at the time that he was on a student visa. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. In a reply to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590 million in 2025), and Musk received $22 million (equivalent to $43 million in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with rival Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to the resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2.7 billion in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320 million in 2025). 
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180 million in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2.4 billion in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865 million in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12 billion in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440 million in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11 million in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In late 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1.2 trillion in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2.7 billion in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. 
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had reached an agreement to buy the company for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives including CEO Parag Agrawal; Musk became the CEO instead. Under Musk, Twitter instituted monthly subscriptions for a "blue check", and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. After a Twitter poll, Musk promised to step down as CEO; five months later, he did so and transitioned to executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks (which hinders visibility and is considered a form of shadow banning) or by suspending their accounts without justification. 
Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income, and endorsed Kanye West's 2020 presidential campaign. 
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021, which featured executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the exclusion was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and speculated that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden's White House as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that these tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and during the campaign promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration, in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" that is organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, owing to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. 
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration attracted public backlash, particularly in response to DOGE. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. In a June 5, 2025 post on X, Musk wrote: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public." 
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, with its most notable event being Musk alleging on X (formerly Twitter), on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funds Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth tax, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars. He has repeatedly pushed for colonizing Mars so that humanity becomes an interplanetary species, lowering the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. 
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was in turn accused of spreading misinformation and amplifying the far right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome at the TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... 
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs and that if drugs somehow improved his productivity, "I would definitely take them!" The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concern among close associates troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has stated have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he must as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to have more children: twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. 
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the ongoing surrogate pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C., during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] The Wall Street Journal later reported that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place because Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking "Do you have any parties planned? 
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to Epstein's house; Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; around 75% of his wealth was derived from Tesla stock in November 2020, although he has described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, Tesla shareholders approved a pay package for Musk worth potentially $1 trillion, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, unlike other billionaires, who tend to prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn criticism for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Pillywiggin] | [TOKENS: 863]
Pillywiggin Pillywiggins are tiny goblins and fairies, guardians of the flora, mentioned in English and Irish folklore. Tiny in size, they have the antennae and wings of a butterfly or dragonfly, live in groups and spend their time frolicking among the flowers. They are described by Nancy Arrowsmith, and later by Pierre Dubois and others in The Great Encyclopedia of Fairies and Lessons in Elficology. Origin Pillywiggins are fairies from English folklore, associated with spring flowers and personifying the "divine essence of plants". They are mentioned in the folklore of Great Britain and Ireland. Pierre Dubois cites the alvens of Holland and certain fairies on the border of the Belgian Ardennes, who play similar roles. The name "Pillywiggin" appeared in 1977 in Field Guide to the Little People by the American author Nancy Arrowsmith, who believes that the name of these creatures comes from the English county of Dorset. It is also found in a collection by American folklorist Tristram Potter Coffin, dated 1984. Pillywiggins are also mentioned in the esoteric work of Faery Wicca author Edain McCoy (1994), who classifies them among the elemental fairies, citing their preference for the shade of great oaks (a characteristic also present in Bane's description), and describing a very seductive pillywiggin queen, who goes by the name of Ariel and rides bats. Description Author Catherine Rager (2003) describes them as pixies, while Theresa Bane associates them with fairies. Winged, they usually measure a centimetre, but can change size. Their food consists of dew and pollen. They are trooping fairies, creatures that live in groups. They have no particular interest in human beings, but may participate in some of their activities, such as wedding ceremonies and other celebrations. Unlike other fairies in British folklore, they are not known for playing tricks on humans. According to Pierre Dubois, they are the tiniest of the elven gentry, along with Lincolnshire's Tiddy. "Wonderfully beautiful" thanks to their butterfly-like attributes, they are fond of English parks and gardens, in all parts of the UK except the Midlands, as well as in Ireland. There, they spend their time playing and frolicking. They are the guardian spirits of small flora, living to the rhythm of the plants they protect. They hibernate from November to April, until the cuckoo chirps. They are said to ignore humans, preferring to dance among the wildflowers in the shade of tall oaks, where they are usually found. Their popular representations show them riding bees from flower to flower, or themselves the size of a bee. According to gardening specialists Karan Davis Cutler and Barbara W. Ellis, English folklore mostly associates pillywiggins with the tulip. Mentions in fiction and video games Pillywiggins gave their name to Julia Jarman's children's novel Pilliwiggins and the Tree Witch. In Alexander of Teagos, Paula Porter describes pillywiggins as beings that are "silent, but speak to your heart". They can also be found in fantasy novels, such as Rebecca Paisley's A Basket of Wishes, Brian Cullen's Seekers of the Chalice, Tiffany Trent's By Venom's Sweet Sting, Tiffany Turner's The Lost Secret of the Green Man, which describes them as guardians of wildflowers, and other works of fiction. The yellow pillywiggin and the red pillywiggin are notable bee-like enemies in the video game Final Fantasy XI. 
In his children's book Leçons d'elficologie, Pierre Dubois presents a plate depicting the metamorphosis of a young pillywiggin into a butterfly fairy. A nursery rhyme published in an Australian children's book describes a singing Pillywiggin. A modern Italian storybook evokes the proximity of Pillywiggins to foxglove and bellflower. Multicolored Pillywiggin is the title of a children's song on Pakita's album Viens vite... Je t'invite, released in 2007.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-nasa20170929-148] | [TOKENS: 11899]
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, large polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, picked up and spread even by the weak wind of the tenuous atmosphere under the low Martian gravity. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground; it also hosts many enormous extinct volcanoes (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall) and one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago. During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period continues to the present and dominates the planet's current geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. The first attempted flight to Mars was that of Mars 1, launched in 1962, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. 
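The year and day lengths quoted above lend themselves to a quick arithmetic check. The following minimal Python sketch (not part of the article; the constant names are illustrative) converts between Earth days and Martian sols using the rounded values given above, a sol of 24.6 hours and a Martian year of 687 Earth days:

# Minimal sketch using the rounded figures quoted above; with the more
# precise sol length of about 24.66 h the count comes out nearer 668.6.

SOL_HOURS = 24.6            # mean Martian solar day, per the article
EARTH_DAY_HOURS = 24.0
MARS_YEAR_EARTH_DAYS = 687  # Martian solar year, per the article

def earth_days_to_sols(earth_days: float) -> float:
    """Convert a duration in Earth days to Martian sols."""
    return earth_days * EARTH_DAY_HOURS / SOL_HOURS

sols_per_mars_year = earth_days_to_sols(MARS_YEAR_EARTH_DAYS)
print(f"Martian year ≈ {sols_per_mars_year:.1f} sols")                    # ≈ 670 sols
print(f"Martian year ≈ {MARS_YEAR_EARTH_DAYS / 365.25:.2f} Earth years")  # ≈ 1.88

The 1.88 Earth-year figure falls straight out of the 687-day value; the sol count (about 670 with these rounded constants) shows why mission teams track surface operations in sols rather than Earth days.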
Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of runaway accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary ones are the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. 
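The mass, volume, and gravity figures quoted above are mutually consistent, since surface gravity scales as M/r². A minimal sketch of the check in Python, using standard reference values for the two planets (the constants below are assumptions taken from general references, not figures from this article):

```python
# Check that ~11% of Earth's mass and ~15% of its volume imply ~38% of
# its surface gravity, via g = G*M / r^2.
G = 6.674e-11                           # gravitational constant, m^3 kg^-1 s^-2
M_MARS, R_MARS = 6.417e23, 3.3895e6     # mass (kg) and mean radius (m) of Mars
M_EARTH, R_EARTH = 5.972e24, 6.371e6    # mass (kg) and mean radius (m) of Earth

def surface_gravity(mass: float, radius: float) -> float:
    """Surface gravity in m/s^2 for a spherical body."""
    return G * mass / radius**2

print(f"mass ratio:    {M_MARS / M_EARTH:.1%}")        # ~10.7%
print(f"volume ratio:  {(R_MARS / R_EARTH)**3:.1%}")   # ~15.1%
print(f"gravity ratio: {surface_gravity(M_MARS, R_MARS) / surface_gravity(M_EARTH, R_EARTH):.1%}")  # ~38.0%
```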
The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in the surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity begins to increase again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 ± 67 kilometres (381 ± 42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7 and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. 
The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day, or 22 millirads per day, measured during the flight to and from Mars. For comparison, radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation, at about 0.342 millisieverts per day; lava tubes southwest of Hadriacus Mons offer potentially even lower levels, down to 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars presently shows no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum, while the southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by the choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. 
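The daily dose rates quoted above for the surface, the cruise, and low Earth orbit allow a rough mission-total comparison. In the sketch below, the 180-day transits and the 500-day surface stay are illustrative assumptions, not figures from this article:

```python
# Rough cumulative-dose comparison built from the per-day rates above.
TRANSIT_MSV_PER_DAY = 1.84   # cruise to/from Mars
SURFACE_MSV_PER_DAY = 0.64   # average on the Martian surface
LEO_MSV_PER_DAY = 0.50       # low Earth orbit, for comparison

transit_days, surface_days = 180, 500   # assumed mission profile
mission = 2 * transit_days * TRANSIT_MSV_PER_DAY + surface_days * SURFACE_MSV_PER_DAY
leo_same_span = (2 * transit_days + surface_days) * LEO_MSV_PER_DAY

print(f"notional Mars round trip: {mission:.0f} mSv")       # ~980 mSv
print(f"same duration in LEO:     {leo_same_span:.0f} mSv") # ~430 mSv
```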
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude, to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea-level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported on an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, including the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, assigning it a definite height is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by some 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. However, Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is also more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps) has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly giving Mars a two-tectonic-plate arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and the high-energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Dust of a given size settles out of the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars remained in the Martian atmosphere for only 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. 
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), higher than Earth's 6 kilometres (3.7 mi) because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen, along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that the methane concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. The higher concentration of atmospheric CO2 and the lower surface pressure may be why sound is attenuated more on Mars than on Earth; natural sound sources there are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound on Mars is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars had temporarily doubled, associated with an aurora 25 times brighter than any observed earlier, due to a massive and unexpected solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, similar to those on Earth. Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity, and the planet reaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. Summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% that of Earth's atmosphere), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, with winds reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase the global temperature. Seasonally, a covering of carbon dioxide dry ice also forms on the polar ice caps. Hydrology While Mars contains substantial amounts of water, most of it is dust-covered water ice at the polar ice caps. 
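Returning to the pressure figures quoted above: the extremes at Hellas Planitia and Olympus Mons are roughly what an isothermal barometric model, p(z) = p0 · exp(−z/H), predicts from the 600 Pa datum pressure and the 10.8 km scale height. A sketch, with the two site elevations as approximate assumptions:

```python
import math

P0 = 600.0   # Pa at the zero-elevation datum
H = 10.8     # scale height, km

def pressure(z_km: float) -> float:
    """Isothermal barometric estimate of pressure at elevation z (km)."""
    return P0 * math.exp(-z_km / H)

print(f"Hellas Planitia (~ -7 km): {pressure(-7):.0f} Pa")  # ~1147 Pa, close to the 1,155 Pa above
print(f"Olympus Mons   (~ 21 km): {pressure(21):.0f} Pa")   # ~86 Pa; the measured ~30 Pa shows
                                                            # where the isothermal assumption fails
```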
The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% of Earth's. Only at the lowest elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with carbon dioxide (dry ice) snow. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along crater and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and to face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies have formed by weathering, and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite, which forms only in the presence of acidic water, showing that water once existed on Mars. 
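The 11-metre figure at the top of this passage is a simple volume-over-area calculation. A back-of-envelope sketch; the ice volume used here (~1.6 million km³) is a published estimate assumed for illustration, since this article quotes only the resulting depth:

```python
ICE_VOLUME_KM3 = 1.6e6    # assumed south polar ice volume, km^3
MARS_AREA_KM2 = 1.448e8   # ~144.8 million km^2 of total surface

layer_m = ICE_VOLUME_KM3 / MARS_AREA_KM2 * 1000   # km -> m
print(f"global equivalent water layer: {layer_m:.0f} m")   # ~11 m
```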
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011 the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth, at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples, including the broken fragments of "Tintina" rock and "Sutton Inlier" rock, as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that it had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (about 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. 
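The "five to seven times" statement above follows directly from the quoted isotope ratios and their stated uncertainty:

```python
DH_EARTH = 1.56e-4                  # terrestrial D/H ratio
DH_MARS, DH_ERR = 9.3e-4, 1.7e-4    # Martian D/H ratio with uncertainty

low = (DH_MARS - DH_ERR) / DH_EARTH    # ~4.9
best = DH_MARS / DH_EARTH              # ~6.0
high = (DH_MARS + DH_ERR) / DH_EARTH   # ~7.1
print(f"deuterium enrichment: {low:.1f}x to {high:.1f}x (central value {best:.1f}x)")
```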
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v, needed to transfer between Earth and Mars is the second lowest of any planet from Earth, after Venus. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years, compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth, around opposition, once per synodic period, which averages 779.94 days; the number of days between successive oppositions can range from 764 to 812. Opposition should not be confused with conjunction, when Mars and Earth are on opposite sides of the Sun, aligned with it. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest, Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71, with a standard deviation of 1.05. Because the orbit of Mars is eccentric, its magnitude at opposition can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86, when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest, because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, meaning it appears to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit the planet at distances of 9,376 km (5,826 mi) and 23,460 km (14,580 mi) respectively. 
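The 779.94-day synodic period quoted above follows from the two orbital periods: both planets circle the Sun in the same direction, so Earth laps Mars at a rate equal to the difference of their angular rates, 1/S = 1/T_Earth − 1/T_Mars. A quick check (the sidereal periods are standard values, not figures from this article):

```python
T_EARTH = 365.256   # Earth's sidereal year, days
T_MARS = 686.98     # Mars's sidereal year, days

S = 1 / (1 / T_EARTH - 1 / T_MARS)
print(f"synodic period: {S:.2f} days, ~{S / 30.44:.1f} months")  # ~779.9 days, ~25.6 months
```

The ~25.6-month result is also the origin of the 26-month launch-window cadence mentioned later in this article.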
The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent of Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from those of Earth's Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below the synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. A third possibility for their origin is the involvement of a third body or a type of impact disruption. Mars may have yet-undiscovered moons smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. More recent lines of evidence – that Phobos has a highly porous interior, and a composition containing mainly phyllosilicates and other minerals known from Mars – point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's Moon. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars and then destroyed by a more recent impact upon the satellite. More recently, a study by a team of researchers from multiple countries suggested that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks that record tidal processes on the planet suggests that these tides may have been regulated by such a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
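Stepping back to the orbits described above: Kepler's third law, T = 2π√(a³/GM), reproduces both moons' behavior relative to the areosynchronous radius. The GM of Mars and its rotation period below are standard reference values assumed for this sketch, not figures quoted in this article:

```python
import math

GM_MARS = 4.2828e13        # gravitational parameter of Mars, m^3/s^2
SIDEREAL_DAY = 88_642.0    # Mars's rotation period, s (~24.62 h)

def orbital_period(a_m: float) -> float:
    """Circular-orbit period via Kepler's third law."""
    return 2 * math.pi * math.sqrt(a_m**3 / GM_MARS)

a_sync = (GM_MARS * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"synchronous radius: {a_sync / 1e3:,.0f} km")                  # ~20,400 km
print(f"Phobos (9,376 km):  {orbital_period(9.376e6) / 3600:.2f} h")  # ~7.7 h, below synchronous
print(f"Deimos (23,460 km): {orbital_period(2.346e7) / 3600:.1f} h")  # ~30.3 h, just above it
```

Phobos's 7.7-hour period, much shorter than the Martian day, is why it rises in the west and sets in the east.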
During Sumerian times, Nergal was a minor deity of little significance, but during later times his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers, and by 1534 BCE they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "the fiery one"); more commonly, though, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known to Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星), based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the orbit of Mars, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, Galileo Galilei became the first to use a telescope for astronomical observation, including of Mars. The diurnal parallax of Mars was later measured telescopically in an effort to determine the Sun-Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. 
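As an aside on the Babylonian period relation quoted above (37 synodic periods of Mars every 79 years), the modern numbers show how accurate it was:

```python
SYNODIC = 779.94   # days, modern mean synodic period of Mars
YEAR = 365.25      # days

print(f"79 years:           {79 * YEAR:,.0f} days")     # 28,855 days
print(f"37 synodic periods: {37 * SYNODIC:,.0f} days")  # 28,858 days -- off by ~3 days in 79 years
```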
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory equipped with 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894 and the following, less favorable, oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers), in combination with the canals, led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft had visited the planet during the 1960s and 1970s, many previous conceptions of Mars were radically overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between Viking 1's shutdown in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without making contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted to this day. Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars hosts nine functioning spacecraft. Seven are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. 
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Several further missions to Mars are planned. As of February 2024, debris on Mars from past missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, setting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere, and it has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and the interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites, and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. 
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters have both been claimed as possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by the impact of meteors, which on Earth can preserve signs of life, has also been found on the surface of impact craters on Mars; if life existed at those sites, the glass could have preserved its traces. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core-sampled by the Perseverance rover for possible return to Earth and further examination. Although highly intriguing, the find permits no definitive determination of a biological or abiotic origin with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be infeasible. In addition, as of 2021 China was planning to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared at the company in April 2024, Elon Musk envisioned the beginning of a Mars colony within the next twenty years, enabled by the planned mass manufacturing of Starship and sustained initially by resupply from Earth and by in situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. 
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links Solar System → Local Interstellar Cloud → Local Bubble → Gould Belt → Orion Arm → Milky Way → Milky Way subgroup → Local Group → Local Sheet → Local Volume → Virgo Supercluster → Laniakea Supercluster → Pisces–Cetus Supercluster Complex → Local Hole → Observable universe → UniverseEach arrow (→) may be read as "within" or "part of".
========================================
[SOURCE: https://en.wikipedia.org/wiki/Weak_artificial_intelligence#cite_ref-10] | [TOKENS: 594]
Weak artificial intelligence Weak artificial intelligence (weak AI) is artificial intelligence that implements a limited part of the mind, or, as narrow AI (also artificial narrow intelligence, ANI), is focused on one narrow task. Weak AI is contrasted with strong AI, which can be interpreted in various ways (for example, as artificial general intelligence or as machine consciousness; see below). Narrow AI can be classified as being "limited to a single, narrowly defined task. Most modern AI systems would be classified in this category." Artificial general intelligence is conversely the opposite. Applications and risks Some examples of narrow AI are AlphaGo, self-driving cars, robot systems used in the medical field, and automated diagnostic systems. Narrow AI systems are sometimes dangerous if unreliable, and their behavior can become inconsistent. It can be difficult for such an AI to grasp complex patterns and reach solutions that work reliably in varied environments. This "brittleness" can cause it to fail in unpredictable ways. Narrow AI failures can sometimes have significant consequences: they could, for example, cause disruptions in the electric grid, damage nuclear power plants, cause global economic problems, and misdirect autonomous vehicles. Medicines could be incorrectly sorted and distributed, and medical diagnoses can ultimately have serious and sometimes deadly consequences if the AI is faulty or biased. Simple AI programs have already worked their way into society, often unnoticed by the public: autocorrection for typing, speech recognition for speech-to-text programs, and vast expansions in the data science fields are examples. Narrow AI has also been the subject of some controversy, including unfair prison sentences, hiring discrimination against women, and deaths caused by autonomous driving, among other cases. Despite being "narrow" AI, recommender systems are efficient at predicting user reactions based on their posts, patterns, or trends. For instance, TikTok's "For You" algorithm can determine a user's interests or preferences in less than an hour (a toy sketch of the underlying idea follows below). Some other social media AI systems are used to detect bots that may be involved in propaganda or other potentially malicious activities. Weak AI versus strong AI John Searle contests the possibility of strong AI (by which he means conscious AI). He further believes that the Turing test (created by Alan Turing and originally called the "imitation game", used to assess whether a machine can converse indistinguishably from a human) is not accurate or appropriate for testing whether an AI is "strong". Scholars such as Antonio Lieto have argued that current research on both AI and cognitive modelling is perfectly aligned with the weak-AI hypothesis (which should not be confused with the "general" vs. "narrow" AI distinction), and that the popular assumption that cognitively inspired AI systems espouse the strong-AI hypothesis is ill-posed and problematic, since "artificial models of brain and mind can be used to understand mental phenomena without pretending that they are the real phenomena that they are modelling" (as, on the other hand, implied by the strong AI assumption).
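To make the recommender-system point above concrete, here is a toy sketch of the underlying idea: score unseen items by their overlap with topics the user has already engaged with. Real systems such as TikTok's are vastly more complex (learned embeddings, engagement signals, exploration); every name and value below is invented for illustration:

```python
# Rank candidate videos by Jaccard similarity between their topic tags
# and the topics of videos the user has already watched.
watched_topics = {"cooking", "woodworking", "camping"}

candidates = {
    "knife-sharpening-101": {"cooking", "knives"},
    "celebrity-gossip":     {"entertainment"},
    "campfire-recipes":     {"cooking", "camping"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap of two tag sets: |A intersect B| / |A union B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

ranked = sorted(candidates, key=lambda v: jaccard(watched_topics, candidates[v]), reverse=True)
print(ranked)  # ['campfire-recipes', 'knife-sharpening-101', 'celebrity-gossip']
```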
========================================
[SOURCE: https://en.wikipedia.org/wiki/Phylogenetic_tree] | [TOKENS: 2394]
Phylogenetic tree A phylogenetic tree or phylogeny is a graphical representation which shows the evolutionary history among a set of species or taxa during a specific time. In other words, it is a branching diagram or a tree showing the evolutionary relationships among various biological species or other entities based upon similarities and differences in their physical or genetic characteristics. In evolutionary biology, all life on Earth is theoretically part of a single phylogenetic tree, indicating common ancestry. Phylogenetics is the study of phylogenetic trees. The main challenge is to find a phylogenetic tree representing optimal evolutionary ancestry between a set of species or taxa. Computational phylogenetics (also phylogeny inference) focuses on the algorithms involved in finding the optimal phylogenetic tree in the phylogenetic landscape. Phylogenetic trees may be rooted or unrooted. In a rooted phylogenetic tree, each node with descendants represents the inferred most recent common ancestor of those descendants, and the edge lengths in some trees may be interpreted as time estimates. Each node is called a taxonomic unit. Internal nodes are generally called hypothetical taxonomic units, as they cannot be directly observed. Trees are useful in fields of biology such as bioinformatics, systematics, and phylogenetics. Unrooted trees illustrate only the relatedness of the leaf nodes and do not require the ancestral root to be known or inferred. History The idea of a tree of life arose from ancient notions of a ladder-like progression from lower into higher forms of life (such as in the Great Chain of Being). Early representations of "branching" phylogenetic trees include a "paleontological chart" showing the geological relationships among plants and animals in the book Elementary Geology, by Edward Hitchcock (first edition: 1840). Charles Darwin featured a diagrammatic evolutionary "tree" in his 1859 book On the Origin of Species. Over a century later, evolutionary biologists still use tree diagrams to depict evolution because such diagrams effectively convey the concept that speciation occurs through the adaptive and semirandom splitting of lineages. The term phylogenetic, or phylogeny, derives from the two Ancient Greek words φῦλον (phûlon), meaning "race, lineage", and γένεσις (génesis), meaning "origin, source". Properties A rooted phylogenetic tree is a directed tree with a unique node – the root – corresponding to the (usually imputed) most recent common ancestor of all the entities at the leaves of the tree. The root node does not have a parent node, but serves as the parent of all other nodes in the tree. The root is therefore a node of degree 2, while other internal nodes have a minimum degree of 3 (where "degree" here refers to the total number of incoming and outgoing edges). The most common method for rooting trees is the use of an uncontroversial outgroup – close enough to allow inference from trait data or molecular sequencing, but far enough to be a clear outgroup. Another method is midpoint rooting, or a tree can also be rooted by using a non-stationary substitution model. Unrooted trees illustrate the relatedness of the leaf nodes without making assumptions about ancestry. They do not require the ancestral root to be known or inferred. Rooted trees can be generated from unrooted ones by inserting a root. Inferring the root of an unrooted tree requires some means of identifying ancestry. 
This is normally done by including an outgroup in the input data so that the root is necessarily between the outgroup and the rest of the taxa in the tree, or by introducing additional assumptions about the relative rates of evolution on each branch, such as an application of the molecular clock hypothesis. Both rooted and unrooted trees can be either bifurcating or multifurcating. A rooted bifurcating tree has exactly two descendants arising from each interior node (that is, it forms a binary tree), and an unrooted bifurcating tree takes the form of an unrooted binary tree, a free tree with exactly three neighbors at each internal node. In contrast, a rooted multifurcating tree may have more than two children at some nodes and an unrooted multifurcating tree may have more than three neighbors at some nodes.[citation needed] Both rooted and unrooted trees can be either labeled or unlabeled. A labeled tree has specific values assigned to its leaves, while an unlabeled tree, sometimes called a tree shape, defines a topology only. Some sequence-based trees built from a small genomic locus, such as Phylotree, feature internal nodes labeled with inferred ancestral haplotypes. The number of possible trees for a given number of leaf nodes depends on the specific type of tree, but there are always more labeled than unlabeled trees, more multifurcating than bifurcating trees, and more rooted than unrooted trees. The last distinction is the most biologically relevant; it arises because there are many places on an unrooted tree to put the root. For bifurcating labeled trees with $n$ leaves, the total number of rooted trees is $(2n-3)!! = \frac{(2n-3)!}{2^{n-2}(n-2)!}$ for $n \ge 2$, and the total number of unrooted trees is $(2n-5)!! = \frac{(2n-5)!}{2^{n-3}(n-3)!}$ for $n \ge 3$. Among labeled bifurcating trees, the number of unrooted trees with $n$ leaves is equal to the number of rooted trees with $n-1$ leaves, since $(2(n-1)-3)!! = (2n-5)!!$. The number of rooted trees grows quickly as a function of the number of tips. For 10 tips, there are more than $34 \times 10^6$ possible bifurcating trees, and the number of multifurcating trees rises faster, with about 7 times as many of the latter as of the former. The short sketch below checks these counts numerically.
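To make the counting formulas concrete, here is a minimal Python sketch (mine, not part of the article; the function names are illustrative) that evaluates the double factorials and checks the figures stated above:

```python
from math import prod

def double_factorial(k: int) -> int:
    # k!! = k * (k - 2) * (k - 4) * ... * 1 for odd positive k; 1 otherwise.
    return prod(range(k, 0, -2)) if k > 0 else 1

def rooted_bifurcating(n: int) -> int:
    # Number of rooted bifurcating labeled trees on n >= 2 leaves: (2n - 3)!!
    return double_factorial(2 * n - 3)

def unrooted_bifurcating(n: int) -> int:
    # Number of unrooted bifurcating labeled trees on n >= 3 leaves: (2n - 5)!!
    return double_factorial(2 * n - 5)

print(rooted_bifurcating(10))    # 34459425 -- more than 34 million, as stated
print(unrooted_bifurcating(10))  # 2027025
# Identity from the text: unrooted trees on n leaves = rooted trees on n - 1 leaves.
assert unrooted_bifurcating(10) == rooted_bifurcating(9)
```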
Special tree types A dendrogram is a general name for a tree, whether phylogenetic or not, and hence also for the diagrammatic representation of a phylogenetic tree. A cladogram only represents a branching pattern; i.e., its branch lengths do not represent time or relative amount of character change, and its internal nodes do not represent ancestors. A phylogram is a phylogenetic tree that has branch lengths proportional to the amount of character change. A chronogram is a phylogenetic tree that explicitly represents time through its branch lengths. A Dahlgrenogram is a diagram representing a cross section of a phylogenetic tree.[citation needed] A phylogenetic network is not strictly speaking a tree, but rather a more general graph, or a directed acyclic graph in the case of rooted networks. Phylogenetic networks are used to overcome some of the limitations inherent to trees. A spindle diagram, or bubble diagram, is often called a romerogram, after its popularisation by the American palaeontologist Alfred Romer. It represents taxonomic diversity (horizontal width) against geological time (vertical axis) in order to reflect the variation of abundance of various taxa through time. A spindle diagram is not an evolutionary tree: the taxonomic spindles obscure the actual relationships of the parent taxon to the daughter taxon and have the disadvantage of involving the paraphyly of the parental group. This type of diagram is no longer used in the form originally proposed. Darwin also mentioned that a coral may be a more suitable metaphor than a tree. Indeed, phylogenetic corals are useful for portraying past and present life, and they have some advantages over trees (anastomoses allowed, etc.). Construction Phylogenetic trees built from a nontrivial number of input sequences are constructed using computational phylogenetics methods. Distance-matrix methods such as neighbor-joining or UPGMA, which calculate genetic distance from multiple sequence alignments, are simplest to implement, but do not invoke an evolutionary model; a minimal worked example appears at the end of this article. Many sequence alignment methods such as ClustalW also create trees by using the simpler algorithms (i.e. those based on distance) of tree construction. Maximum parsimony is another simple method of estimating phylogenetic trees, but implies an implicit model of evolution (i.e. parsimony). More advanced methods use the optimality criterion of maximum likelihood, often within a Bayesian framework, and apply an explicit model of evolution to phylogenetic tree estimation. Identifying the optimal tree using many of these techniques is NP-hard, so heuristic search and optimization methods are used in combination with tree-scoring functions to identify a reasonably good tree that fits the data. Tree-building methods can be assessed on the basis of several criteria. Tree-building techniques have also gained the attention of mathematicians. Trees can also be built using T-theory. Trees can be encoded in a number of different formats, all of which must represent the nested structure of a tree. They may or may not encode branch lengths and other features. Standardized formats are critical for distributing and sharing trees without relying on graphics output that is hard to import into existing software. Commonly used formats include the Newick format, the NEXUS format, and phyloXML. Limitations of phylogenetic analysis Although phylogenetic trees produced on the basis of sequenced genes or genomic data in different species can provide evolutionary insight, these analyses have important limitations. Most importantly, the trees that they generate are not necessarily correct – they do not necessarily accurately represent the evolutionary history of the included taxa. As with any scientific result, they are subject to falsification by further study (e.g., gathering of additional data, analyzing the existing data with improved methods). The data on which they are based may be noisy; the analysis can be confounded by genetic recombination, horizontal gene transfer, hybridisation between species that were not nearest neighbors on the tree before hybridisation takes place, and conserved sequences. Also, there are problems in basing an analysis on a single type of character, such as a single gene or protein or only on morphological analysis, because trees constructed from another, unrelated data source often differ from the first, and therefore great care is needed in inferring phylogenetic relationships among species. This is most true of genetic material that is subject to lateral gene transfer and recombination, where different haplotype blocks can have different histories. In these types of analysis, the output tree of a phylogenetic analysis of a single gene is an estimate of the gene's phylogeny (i.e. 
a gene tree) and not the phylogeny of the taxa (i.e. species tree) from which these characters were sampled, though ideally, both should be very close. For this reason, serious phylogenetic studies generally use a combination of genes that come from different genomic sources (e.g., from mitochondrial or plastid vs. nuclear genomes), or genes that would be expected to evolve under different selective regimes, so that homoplasy (false homology) would be unlikely to result from natural selection. When extinct species are included as terminal nodes in an analysis (rather than, for example, to constrain internal nodes), they are considered not to represent direct ancestors of any extant species. Extinct species typically do not yield high-quality DNA. The range of useful DNA materials has expanded with advances in extraction and sequencing technologies. Development of technologies able to infer sequences from smaller fragments, or from spatial patterns of DNA degradation products, would further expand the range of DNA considered useful. Phylogenetic trees can also be inferred from a range of other data types, including morphology, the presence or absence of particular types of genes, insertion and deletion events – and any other observation thought to contain an evolutionary signal. Phylogenetic networks are used when bifurcating trees are not suitable, due to these complications, which suggest a more reticulate evolutionary history of the organisms sampled.
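The following is the minimal worked example referred to under Construction: a compact UPGMA sketch (mine, not from the article; the taxon labels and distance matrix are invented for illustration, and real analyses would use a dedicated package). It repeatedly merges the closest pair of clusters, averaging distances weighted by cluster size, and prints the resulting rooted, ultrametric tree in the Newick format mentioned above.

```python
def upgma(labels, dist):
    """labels: taxon names; dist: symmetric distance matrix (list of lists).
    Returns a Newick string with ultrametric branch lengths."""
    # Each active cluster: (newick_subtree, leaf_count, height_above_leaves)
    clusters = {i: (labels[i], 1, 0.0) for i in range(len(labels))}
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    next_id = len(labels)
    while len(clusters) > 1:
        i, j = min(d, key=d.get)                 # closest pair of clusters
        name_i, size_i, h_i = clusters[i]
        name_j, size_j, h_j = clusters[j]
        height = d[(i, j)] / 2.0                 # height of the new node
        newick = "({}:{:.4f},{}:{:.4f})".format(
            name_i, height - h_i, name_j, height - h_j)
        # Size-weighted average distance from the merged cluster to the rest
        for k in clusters:
            if k in (i, j):
                continue
            d_ik = d[(min(i, k), max(i, k))]
            d_jk = d[(min(j, k), max(j, k))]
            d[(k, next_id)] = (d_ik * size_i + d_jk * size_j) / (size_i + size_j)
        del clusters[i], clusters[j]
        d = {pair: v for pair, v in d.items() if i not in pair and j not in pair}
        clusters[next_id] = (newick, size_i + size_j, height)
        next_id += 1
    root, _, _ = next(iter(clusters.values()))
    return root + ";"

# Invented distances between four taxa
labels = ["A", "B", "C", "D"]
dist = [[0, 2, 6, 6],
        [2, 0, 6, 6],
        [6, 6, 0, 4],
        [6, 6, 4, 0]]
print(upgma(labels, dist))
# ((A:1.0000,B:1.0000):2.0000,(C:2.0000,D:2.0000):1.0000);
```

Neighbor-joining has the same merge-loop shape but uses a rate-corrected criterion for choosing which pair to join and does not assume a molecular clock, which is why it is generally preferred for real sequence data.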
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-142] | [TOKENS: 17273]
United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German, and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. 
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. 
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1563) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764), and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. 
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. 
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. 
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies, Italy and Germany, until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. 
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. 
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States experiences more high-impact extreme weather events than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered the most attractive to the population are also the most vulnerable. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024[update]. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. 
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement. The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. Total strength of the entire military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. 
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. The state police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including through the USMCA with Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, home to the only high-speed rail line in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, including five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities; some airports are privately owned. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight, in contrast to the more passenger-centered rail systems of Europe. Because they are mostly privately owned and operated, U.S. railroads lag behind the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, the busiest among them being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and in 2019, at 23%, it had the world's highest rate of children living in single-parent households. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. aged five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been widening ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries, for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. 
In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government, namely the U.S. service academies, the Naval Postgraduate School, and the military staff colleges, do not charge tuition and are limited to military personnel and government employees. Despite existing student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities, an agency of the United States federal government established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States", is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have arrived early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and some trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars such as Frank Sinatra and Elvis Presley became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful movies selling the most tickets in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It would become the United States' most prestigious culinary school, where many of the most talented American chefs study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are among the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Maor_Farid#cite_note-18] | [TOKENS: 1458]
Contents Maor Farid Dr. Maor Farid (Hebrew: מאור פריד; born April 20, 1992) is an Israeli scientist, engineer, and artificial intelligence researcher at the Massachusetts Institute of Technology, as well as a social activist and author. He is the founder and CEO of Learn to Succeed (Hebrew: ללמוד להצליח), an organization for empowering youths from the Israeli socio-economic periphery and youths at risk; a regional manager of the Israeli center of ScienceAbroad at MIT; and an activist in the American Technion Society. He is an alumnus of Unit 8200 and a fellow of the Fulbright Program and the Israel Scholarship Educational Foundation [he]. Dr. Farid was elected to the Forbes 30 Under 30 list of 2019 and won the Moskowitz Prize for Zionism. Early life Maor was born in Ness Ziona, a city in central Israel, the eldest son of parents from Mizrahi Jewish families who immigrated from Iraq and Libya. Maor suffered from attention deficit hyperactivity disorder (ADHD) from a young age and was classified as a problematic and violent student. His ADHD was diagnosed only after he began his university studies. However, inspired by his parents' background, he aspired to excel at school to secure a better future for his family. During elementary school, Maor competed in local quizzes on Jewish history and Zionism, which significantly shaped his identity and national perspective. Farid graduated from high school with the highest GPA in his school. He was later recruited to the Israel Defense Forces and drafted into the Brakim Program [he], an excellence program of the Israeli Intelligence Corps for training leading R&D officers for the Israeli military and defense industry. Maor graduated from the program with honors and was selected by the Israeli Prime Minister's Office and Unit 8200, where he served as an artificial intelligence researcher, officer, and commander. During his military service, he received various honors and awards, such as the Excellent Scientist Award, given to the top three academics serving in the Israel Defense Forces. In 2019, Farid completed his military service at the rank of captain. Education and academic career As part of the four-year Brakim Program, Maor completed his bachelor's and master's degrees in mechanical engineering at the Technion with honors. He then began his Ph.D. research in collaboration with the Israel Atomic Energy Commission (IAEC), in parallel with his military service. The main goals of his Ph.D. research were predicting irreversible effects of major earthquakes on Israel's nuclear facilities and improving their seismic resistance using energy absorption technologies. The mathematical models developed by Farid were able to forecast earthquake effects on facilities with major hazard potential, and predicted the failure of liquid storage tanks in earthquakes that took place in Italy (2012) and Mexico (2017). The energy absorption technologies used increased the seismic resistance of those sensitive facilities by up to 90%. The research results were published in multiple papers in peer-reviewed academic journals and presented at international academic conferences. The research later expanded into an official collaboration between the Technion and the Shimon Peres Negev Nuclear Research Center, which aims to implement the findings on existing sensitive systems and won funding of 1.5 million NIS from the Pazy Foundation of the Israel Atomic Energy Commission and the Council for Higher Education. In 2017, Farid completed his Ph.D. 
at the age of 24, as the youngest Technion graduate of that year. At the graduation ceremony, he honored his parents by having them receive the diploma on his behalf. That same year, he served as a lecturer at Ben-Gurion University, teaching an original course he developed to address knowledge gaps he had identified in the Israeli defense industry. In 2018, Dr. Farid served as an artificial intelligence researcher on a data science team of Unit 8200, where he developed machine learning-based solutions for military and operational needs. In 2019, Farid won the Fulbright and Israel Scholarship Educational Foundation scholarships and was accepted to a postdoctoral position at the Massachusetts Institute of Technology, where he develops real-time methods for predicting earthquake effects using machine learning techniques. In 2020, Farid was accepted to the Emerging Leaders Program at Harvard Kennedy School in Cambridge, Massachusetts. That same year, he received an excellence research grant from the Israel Academy of Sciences and Humanities for leading his research, conducted in collaboration between MIT and the Technion. Social activism Farid's social activism focuses on empowering youths from disadvantaged backgrounds from an early age. From 2010 to 2015, he served as a mentor of a robotics team from Dimona in the FIRST Robotics Competition, a mathematics tutor in the "Aharai!" [he] program for high-school students at risk in Dimona and Be'er Sheva, and a mentor and private tutor of adolescents and reserve-duty soldiers from disadvantaged backgrounds. In 2010, he initiated the "Learn to Succeed" (Hebrew: ללמוד להצליח) project to mitigate social gaps in Israeli society by empowering youths from the social, economic, and geographic periphery toward excellence, self-fulfillment, and formal education. In 2018, Learn to Succeed became an official non-profit organization. That same year, Farid led a crowdfunding campaign that raised 150,000 NIS to expand the organization to a national scale. In 2019, he published the book "Learn to Succeed", in which he describes his struggle with ADHD, the violent environment in which he grew up, and the transformation he underwent from violent teenager to the youngest Ph.D. graduate at the Technion. The book was given to more than two thousand youths at risk and became a top seller in Israel shortly after its publication. Maor dedicated the book to his parents and to the memory of his friend Captain Tal Nachman, who was killed in operational activity during his military service in 2014. The organization consists of hundreds of volunteers; it gives full scholarships to STEM students from the periphery who serve as mentors to youths, both Jews and Arabs, from disadvantaged backgrounds; runs a hotline that gives online practical and emotional support to hundreds of youths, parents, and educators; organizes military-oriented inspirational activities to increase the motivation of its teenage members for meaningful military service; and gives inspirational lectures to more than 5,000 youths each year. In 2019, Maor initiated a collaboration with Unit 8200 through which dozens of the program's members are interviewed for the unit, an opportunity usually given to students with the highest grades in the matriculation exams in each class. In 2020, Dr. Farid established the ScienceAbroad center at MIT, aiming to strengthen the connections between Israeli researchers at the institute and the State of Israel. 
Moreover, he serves as a volunteer in the American Technion Society. Personal life Farid is married to Michal.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Harvard_Mark_I] | [TOKENS: 2140]
Contents Harvard Mark I The Harvard Mark I, or IBM Automatic Sequence Controlled Calculator (ASCC), was one of the earliest general-purpose electromechanical computers, used in the war effort during the last part of World War II. One of the first programs to run on the Mark I was initiated on 29 March 1944 by John von Neumann. At that time, von Neumann was working on the Manhattan Project and needed to determine whether implosion was a viable choice to detonate the atomic bomb that would be used a year later. The Mark I also computed and printed mathematical tables, which had been the initial goal of British inventor Charles Babbage for his analytical engine in 1837. According to Edmund Berkeley, the operators of the Mark I often called the machine "Bessy, the Bessel engine", after Bessel functions. The Mark I was disassembled in 1959; part of it was given to IBM, part went to the Smithsonian Institution, and part entered the Harvard Collection of Historical Scientific Instruments. For decades, Harvard's portion was on display in the lobby of the Aiken Computation Lab. Around 1997, it was moved to the Harvard Science Center. In 2021, it was moved again, to the lobby of Harvard's new Science and Engineering Complex in Allston, Massachusetts. Origins The origins of the Harvard Mark I can be traced back to 1935, when Howard Aiken conceived of building a powerful, large-scale calculating machine while pursuing graduate studies in physics at Harvard University. Aiken presented his proposal for an automatic calculating machine to IBM in November 1937. The concept was intended to solve advanced mathematical problems. After a feasibility study by IBM engineers, the company's chairman, Thomas Watson Sr., personally approved the project and its funding in February 1939. Howard Aiken had started to look for a company to design and build his calculator in early 1937. After two rejections, he was shown a demonstration set that Charles Babbage's son had given to Harvard University 70 years earlier. This led him to study Babbage and to add references to the Analytical Engine to his proposal; the resulting machine "brought Babbage's principles of the Analytical Engine almost to full realization, while adding important new features." In 1941, Aiken had to put the project on hiatus, as he was called into active naval service in World War II. The ASCC was developed and built by IBM at its Endicott plant in early 1943, and it was shipped to Harvard in February 1944. It began computations for the US Navy Bureau of Ships in May and was officially presented to the university on August 7, 1944. Although not the first working computer, the machine was the first to automate the execution of complex calculations, making it a significant step forward for computing. Design and construction The ASCC was built from switches, relays, rotating shafts, and clutches. It used 765,000 electromechanical components and hundreds of miles of wire, occupying a volume of 816 cubic feet (23 m3) – 51 feet (16 m) in length, 8 feet (2.4 m) in height, and 2 feet (0.61 m) deep. It weighed about 9,445 pounds (4.7 short tons; 4.3 t). The basic calculating units had to be synchronized and powered mechanically, so they were operated by a 50-foot (15 m) drive shaft coupled to a 5 horsepower (3.7 kW) electric motor, which served as the main power source and system clock. From the IBM Archives: The Automatic Sequence Controlled Calculator (Harvard Mark I) was the first operating machine that could execute long computations automatically. 
A project conceived by Harvard University’s Dr. Howard Aiken, the Mark I was built by IBM engineers in Endicott, N.Y. A steel frame 51 feet long and 8 feet high held the calculator, which consisted of an interlocking panel of small gears, counters, switches and control circuits, all only a few inches in depth. The ASCC used 500 miles (800 km) of wire with three million connections, 3,500 multipole relays with 35,000 contacts, 2,225 counters, 1,464 tenpole switches and tiers of 72 adding machines, each with 23 significant numbers. It was the industry’s largest electromechanical calculator. The enclosure for the Mark I was designed by futuristic American industrial designer Norman Bel Geddes at IBM's expense. Aiken was annoyed that the cost ($50,000 or more according to Grace Hopper) was not used to build additional computer equipment. Operation The Mark I had 60 sets of 24 switches for manual data entry and could store 72 numbers, each 23 decimal digits long. It could do 3 additions or subtractions in a second. A multiplication took 6 seconds, a division took 15.3 seconds, and a logarithm or a trigonometric function took over one minute. The Mark I read its instructions from a 24-channel punched paper tape. It executed the current instruction and then read the next one. A separate tape could contain numbers for input, but the tape formats were not interchangeable. Instructions could not be executed from the storage registers. Because instructions were not stored in working memory, it is widely claimed that the Harvard Mark I was the origin of the Harvard architecture. However, this is disputed in The Myth of the Harvard Architecture published in the IEEE Annals of the History of Computing, which shows the term 'Harvard architecture' did not come into use until the 1970s (in the context of microcontrollers) and was only retrospectively applied to the Harvard machines, and that the term could only be applied to the Mark III and IV, not to the Mark I or II.[citation needed] The main sequence mechanism was unidirectional. This meant that complex programs had to be physically lengthy. A program loop was accomplished by loop unrolling or by joining the end of the paper tape containing the program back to the beginning of the tape (literally creating a loop). At first, conditional branching in Mark I was performed manually. Later modifications in 1946 introduced automatic program branching (by subroutine call). The first programmers of the Mark I were computing pioneers Richard Milton Bloch, Robert Campbell, and Grace Hopper. There was also a small technical team whose assignment was to actually operate the machine; some had been IBM employees before being required to join the Navy to work on the machine. This technical team was not informed of the overall purpose of their work while at Harvard.[citation needed] Instruction format The 24 channels of the input tape were divided into three fields of eight channels each. Each storage location, each set of switches, and the registers associated with the input, output, and arithmetic units were assigned a unique identifying index number. These numbers were represented in binary on the control tape. The first field was the binary index of the result of the operation, the second was the source datum for the operation and the third field was a code for the operation to be performed. Contribution to the Manhattan Project In 1928 L.J. 
Comrie was the first to turn IBM "punched-card equipment to scientific use: computation of astronomical tables by the method of finite differences, as envisioned by Babbage 100 years earlier for his Difference Engine". Very soon after, IBM started to modify its tabulators to facilitate this kind of computation. One of these tabulators, built in 1931, was the Columbia Difference Tabulator. John von Neumann had a team at Los Alamos that used "modified IBM punched-card machines" to determine the effects of the implosion. In March 1944, he proposed to run certain problems regarding implosion on the Mark I, and he arrived at Harvard together with two mathematicians to write a simulation program to study the implosion of the first atomic bomb. The Los Alamos group completed its work in a much shorter time than the Cambridge group. However, the punched-card machine operation computed values to six decimal places, whereas the Mark I computed values to eighteen decimal places. Additionally, the Mark I integrated the partial differential equation at a much smaller interval size [or smaller mesh] and so...achieved far greater precision. "Von Neumann joined the Manhattan Project in 1943, working on the immense number of calculations needed to build the atomic bomb. He showed that the implosion design, which would later be used in the Trinity and Fat Man bombs, was likely faster and more efficient than the gun design." Aiken and IBM Aiken published a press release announcing the Mark I, listing himself as the sole inventor. James W. Bryce was the only IBM person mentioned, even though several IBM engineers, including Clair Lake and Frank Hamilton, had helped to build various elements. IBM chairman Thomas J. Watson was enraged and only reluctantly attended the dedication ceremony on August 7, 1944. Aiken, in turn, decided to build further machines without IBM's help, and the ASCC came to be generally known as the "Harvard Mark I". IBM went on to build its Selective Sequence Electronic Calculator (SSEC) to both test new technology and provide more publicity for the company's efforts. Successors The Mark I was followed by the Harvard Mark II (1947 or 1948), the Mark III/ADEC (September 1949), and the Harvard Mark IV (1952) – all the work of Aiken. The Mark II was an improvement over the Mark I, although it was still based on electromechanical relays. The Mark III used mostly electronic components—vacuum tubes and crystal diodes—but also included mechanical components: rotating magnetic drums for storage, plus relays for transferring data between drums. The Mark IV was all-electronic, replacing the remaining mechanical components with magnetic core memory. The Mark II and Mark III were delivered to the US Navy base at Dahlgren, Virginia. The Mark IV was built for the US Air Force, but it stayed at Harvard.[citation needed] The Mark I was disassembled in 1959, and portions of it went on display in the Science Center, as part of the Harvard Collection of Historical Scientific Instruments. It was relocated to the new Science and Engineering Complex in Allston in July 2021. Other sections of the original machine had much earlier been transferred to IBM and the Smithsonian Institution.
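The three-field tape format and the operation timings described in the Operation and Instruction format sections above can be made concrete with a short sketch. The Python fragment below is illustrative only: the field order (result index, source index, operation code) follows the account above, but the exact bit packing, the op-code values, and the register numbers are invented for the example, not taken from the machine's real coding tables.

# Hypothetical sketch of the Mark I's three-field instruction layout.
# Field order (result, source, operation) follows the article; the bit
# packing and op-code assignments below are illustrative assumptions.

# Approximate per-operation timings quoted in the Operation section.
OP_SECONDS = {
    "ADD": 1 / 3,   # 3 additions or subtractions per second
    "SUB": 1 / 3,
    "MUL": 6.0,     # one multiplication took 6 seconds
    "DIV": 15.3,    # one division took 15.3 seconds
}

# Assumed (non-historical) op-code values for the third field.
OP_CODES = {0x01: "ADD", 0x02: "SUB", 0x03: "MUL", 0x04: "DIV"}

def decode(row: int) -> tuple[int, int, str]:
    """Split a 24-channel tape row into (result_index, source_index, op)."""
    if not 0 <= row < 2 ** 24:
        raise ValueError("a tape row carries exactly 24 channels")
    result_index = (row >> 16) & 0xFF  # first field: where the result goes
    source_index = (row >> 8) & 0xFF   # second field: the source datum
    op = OP_CODES[row & 0xFF]          # third field: the operation
    return result_index, source_index, op

def run_time(tape: list[int]) -> float:
    """Estimate the seconds needed to run a tape front to back."""
    return sum(OP_SECONDS[decode(row)[2]] for row in tape)

# A three-row tape: one addition, one multiplication, one division.
tape = [
    (0x05 << 16) | (0x0A << 8) | 0x01,  # ADD: source 10, result 5
    (0x05 << 16) | (0x0B << 8) | 0x03,  # MUL: source 11, result 5
    (0x05 << 16) | (0x0C << 8) | 0x04,  # DIV: source 12, result 5
]
print(f"estimated run time: {run_time(tape):.1f} s")  # about 21.6 s

Because the sequence mechanism was unidirectional, the estimate simply sums the cost of each row in order; on the real machine a loop meant unrolling the instructions along the tape or physically splicing the tape into a ring.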
========================================
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_Multitap] | [TOKENS: 290]
Contents PlayStation Multitap The PlayStation Multitap is a peripheral for the PlayStation and PlayStation 2. It is an adapter that can be used to plug up to four controllers and memory cards into a single controller port at the same time. With a second multitap, up to eight controllers and memory cards can be plugged in at once. For 4-player games, the multitap must be plugged into controller port 1. Compatibility The PlayStation Multitap was originally available in gray (SCPH-1070 U) to match the original console's color; however, it was later re-released in white as well (SCPH-1070 UH) to match the colors of the later PS one redesign. Both versions are compatible with the original PlayStation and the PS one, as well as all models of the PlayStation 2 prior to the SCPH-70000 series. Both versions of the SCPH-1070 will only function with original PlayStation games, while multiplayer PS2 games require a separate multitap, the SCPH-10090. PlayStation 2 consoles from the SCPH-70000 series require the SCPH-70120 multitap, which is compatible with both PS and PS2 software. Supported games Supported games are grouped by the player counts they allow: 1–3, 1–4, 1–5, 1–6, and 1–8 players, for both PlayStation and PlayStation 2 titles.
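The compatibility rules above amount to a small decision table. The sketch below restates them in Python for illustration; the SCPH model numbers come from the text, while the console and platform labels are invented identifiers, and the function is not based on any official Sony documentation.

# Illustrative restatement of the multitap compatibility rules above.
# Console/platform labels are made up for this sketch; the SCPH model
# numbers are the ones named in the article.

def required_multitap(console: str, game_platform: str) -> str:
    """Return the multitap model needed for a given console and game.

    console: "PS1", "PS2_pre_70000", or "PS2_70000_plus"
    game_platform: "PS1" or "PS2"
    """
    if console == "PS2_70000_plus":
        # SCPH-70000-series PS2 consoles need the SCPH-70120, which
        # handles both PS1 and PS2 software.
        return "SCPH-70120"
    if game_platform == "PS1":
        # The SCPH-1070 (gray or white) only works with PS1 games, on
        # the PS1/PS one and on PS2 models before the SCPH-70000 series.
        return "SCPH-1070"
    if console == "PS2_pre_70000":
        # Multiplayer PS2 games need the separate SCPH-10090.
        return "SCPH-10090"
    raise ValueError("a PS1 console cannot run PS2 games")

assert required_multitap("PS1", "PS1") == "SCPH-1070"
assert required_multitap("PS2_pre_70000", "PS2") == "SCPH-10090"
assert required_multitap("PS2_70000_plus", "PS1") == "SCPH-70120"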
========================================
[SOURCE: https://en.wikipedia.org/wiki/File:Mars_Before_and_During_Global_Dust_Storm.jpg] | [TOKENS: 301]
File:Mars Before and During Global Dust Storm.jpg Summary When Hubble photographed Mars in early September 2001 (right), the storm had already been raging across the planet for nearly two months, obscuring all surface features. The fine airborne dust blocks a significant amount of sunlight from reaching the Martian surface. Because the airborne dust is absorbing this sunlight, it heats the upper atmosphere. Seasonal global Mars dust storms have been observed from telescopes for over a century, but this is the biggest storm seen in the past several decades. Mars looks gibbous in the right photograph because it is 26 million miles farther from Earth than in the left photo (though the pictures have been scaled to the same angular size), and our viewing angle has changed. The left picture was taken when Mars was near its closest approach to Earth for 2001; at that point the disk of Mars was fully illuminated as seen from Earth because Mars was exactly opposite the Sun. For more information, visit: hubblesite.org/news_release/news/2001-31
========================================
[SOURCE: https://en.wikipedia.org/wiki/Safed] | [TOKENS: 10577]
Contents Safed Safed (/ˈsɑːfɛd/ SAH-fed; Arabic: صَفَد, romanized: Ṣafad), also known as Tzfat and officially as Zefat (Hebrew: צְפַת, romanized: Ṣəp̄aṯ),[a] is a city in the Northern District of Israel. Located at an elevation of up to 937 m (3,074 ft), Safed is the highest city in the Galilee and in Israel. In 2022, 93.2% of the population was Jewish and 6.8% was counted as other. Safed has been identified with Sepph (Σέπφ), a fortified town in the Upper Galilee mentioned in the writings of the Roman Jewish historian Josephus. The Jerusalem Talmud mentions Safed as one of five elevated spots where fires were lit to announce the New Moon and festivals during the Second Temple period. Safed attained local prominence under the Crusaders, who built a large fortress there in 1168. It was conquered by Saladin 20 years later, and demolished by his grandnephew al-Mu'azzam Isa in 1219. After reverting to the Crusaders in a treaty in 1240, a larger fortress was erected, which was expanded and reinforced in 1268 by the Mamluk sultan Baybars, who developed Safed into a major town and the capital of a new province spanning the Galilee. After a century of general decline, the stability brought by the Ottoman conquest in 1517 ushered in nearly a century of growth and prosperity in Safed, during which time Jewish immigrants from across Europe developed the city into a center for wool and textile production and the mystical Kabbalah movement. It became known as one of the Four Holy Cities of Judaism. As the capital of the Safad Sanjak, it was the main population center of the Galilee, with large Muslim and Jewish communities. Apart from the prosperous governorship of Fakhr al-Din II in the early 17th century, the city underwent a general decline, and by the mid-18th century it was eclipsed by Acre. Its Jewish residents were targeted in Druze and local Muslim raids in the 1830s, and many perished in an earthquake in that same decade; through the philanthropy of Moses Montefiore, its Jewish synagogues and homes were rebuilt. Safed's population reached 24,000 toward the end of the 19th century; it was a mixed city, divided roughly equally between Jews and Muslims with a small Christian community. Its Muslim merchants played a key role as middlemen in the grain trade between the local farmers and the traders of Acre, while the Ottomans promoted the city as a center of Sunni jurisprudence. Safed's conditions improved considerably in the late 19th century; a municipal council was established along with a number of banks, though the city's jurisdiction was limited to the Upper Galilee. By 1922, Safed's population had dropped to around 8,700, roughly 60% Muslim, 33% Jewish, and the remainder Christian. Amid rising ethnic tension throughout Mandatory Palestine, Safed's Jews were attacked in an Arab riot in 1929. The city's population had risen to 13,700 by 1948, overwhelmingly Arab, though the city was proposed to be part of a Jewish state in the 1947 UN Partition Plan. During the 1948 war, Arab factions attacked and besieged the Jewish quarter, which held out until Jewish paramilitary forces captured the city after heavy fighting, prompting British forces to withdraw. Most of the city's predominantly Palestinian-Arab population fled or were expelled as a result of attacks by Jewish forces and the nearby Ein al-Zeitun massacre, and were not allowed to return after the war, such that today the city has an almost exclusively Jewish population. 
That year, the city became part of the then-newly established state of Israel. Safed has a large Haredi community and remains a center for Jewish religious studies. Safed today hosts the Ziv Hospital as well as the Zefat Academic College. Safed is a major subject in Israeli art and hosts an Artists' Quarter. Several prominent art movements played a role in the city, most notably the École de Paris. However, the Artists' Quarter has declined since its golden age in the second half of the 20th century. Due to its high elevation, the city has warm summers and cold, often snowy winters. Its mild climate and scenic views have made Safed a popular holiday resort frequented by Israelis and foreign visitors. In 2023 it had a population of 39,179. Biblical reference Legend has it that Safed was founded by a son of Noah after the Great Flood. According to the Book of Judges (Judges 1:17), the area where Safed is located was assigned to the tribe of Naphtali. It has been suggested that Jesus' assertion that "a city that is set on a hill cannot be hidden" referred to Safed. History Safed has been identified with Sepph, a fortified town in the Upper Galilee mentioned in the writings of the Roman-Jewish historian Josephus. Safed is mentioned in the Jerusalem Talmud as one of five elevated spots where fires were lit to announce the New Moon and festivals during the Second Temple period. There is scarce information about Safed before the Crusader conquest. A document from the Cairo Geniza, composed in 1034, mentions a transaction made in Tiberias in 1023 by a certain Jew, Musa ben Hiba ben Salmun, with the nisba (Arabic descriptive suffix) "al-Safati" (of Safed), indicating the presence of a Jewish community living alongside Muslims in Safed in the 11th century. According to the Muslim historian Ibn Shaddad (d. 1285), a "flourishing village" beneath a tower called Burj Yatim existed at the site of Safed at the beginning of the 12th century, on the eve of the Crusaders' capture of the area in 1101–1102, and "nothing" about the village was mentioned in "the early Islamic history books". Although Ibn Shaddad mistakenly attributes the tower's construction to the Knights Templar, the modern historian Ronnie Ellenblum asserts that the tower was likely built during the early Muslim period (mid-7th–11th centuries). The Frankish chronicler William of Tyre noted the presence of a burgus (tower) in Safed, which he called "Castrum Saphet" or "Sephet", in 1157. Safed was the seat of a castellany (area governed by a castle) by at least 1165, when its castellan (appointed castle governor) was Fulk, constable of Tiberias. The castle of Safed was purchased from Fulk by King Amalric of Jerusalem in 1168. He subsequently reinforced the castle and transferred it to the Templars in the same year. Theoderich the Monk, describing his visit to the area in 1172, noted that the expanded fortification of the castle of Safed was meant to check the raids of the Turks (the Turkic Zengid dynasty ruled the area east of the Kingdom). Testifying to the considerable expansion of the castle, the chronicler Jacques de Vitry (d. 1240) wrote that it was practically built anew. The remains of Fulk's castle can now be found under the citadel excavations, on a hill above the old city. In the estimation of the modern historian Hervé Barbé, the castellany of Safed comprised approximately 376 square kilometers (145 sq mi). According to Barbé, its western boundary straddled the domains of Acre, including the fief of St. 
George de la Beyne, which included Sajur and Beit Jann, and the fief of Geoffrey le Tor, which included Akbara and Hurfeish, and in the southwest ran north of Maghar and Sallama. Its northern boundary was marked by the Nahal Dishon (Wadi al-Hindaj) stream, its southern boundary was likely formed near Wadi al-Amud, separating it from the fief of Tiberias, while its eastern limits were the marshes of the Hula Valley and upper Jordan Valley. There were several Jewish communities in the castellany of Safed, as testified in the accounts of Jewish pilgrims and chroniclers between 1120 and 1293. Benjamin of Tudela, who visited the town in 1170, does not record any Jews living in Safed proper. Safed was captured by the Ayyubids led by Sultan Saladin in 1188 after a month-long siege, following the Battle of Hattin in 1187. Saladin ultimately allowed its residents to relocate to Tyre. He granted Safed and Tiberias as an iqta (akin to a fief) to Sa'd al-Din Mas'ud ibn Mubarak (d. 1211), the son of his niece, after which it was bequeathed to Sa'd al-Din's son Ahmad. Samuel ben Samson, who visited the town in 1210, mentions the existence of a Jewish community of at least fifty there. He also noted that two Muslims guarded and maintained the cave tomb of a rabbi, Hanina ben Horqano, in Safed. The iqta of Safed was taken from the family of Sa'd al-Din by the Ayyubid emir of Damascus, al-Mu'azzam Isa, in 1217. Two years later, during the Crusader siege of Damietta, al-Mu'azzam Isa had the Safed castle demolished to prevent its capture and reuse by potential future Crusaders. As an outcome of the treaty negotiations between the Crusader leader Theobald I of Navarre and the Ayyubid al-Salih Ismail, Emir of Damascus, Safed once again passed to Crusader control in 1240. Afterward, the Templars were tasked with rebuilding the Citadel of Safed, with efforts spearheaded by Benedict of Alignan, Bishop of Marseille. The rebuilding is recorded in a short treatise, De constructione castri Saphet, from the early 1260s. The reconstruction was completed at the considerable expense of 40,000 bezants in 1243. The new fortress was larger than the original, with a capacity for 2,200 soldiers in time of war and a resident force of 1,700 in peacetime. The garrison's goods and services were provided by the town or large village growing rapidly beneath the fortress, which, according to Benedict's account, contained a market and "numerous inhabitants" and was protected by the fortress. The settlement also benefited from trade with travelers on the route between Acre and the Jordan Valley, which passed through Safed. The Ayyubids of Egypt had been supplanted by the Mamluks in 1250, and the Mamluk sultan Baybars entered Syria with his army in 1261. Thereafter, he led a series of campaigns over several years against Crusader strongholds across the Syrian coastal mountains. Safed, with its position overlooking the Jordan River and allowing the Crusaders early warnings of Muslim troop movements in the area, had been a consistent source of aggravation for the Muslim regional powers. After a six-week siege, Baybars captured Safed in July 1266, after which he had nearly the entire garrison killed. The siege occurred during a Mamluk military campaign to subdue Crusader strongholds in Palestine and followed a failed attempt to capture the Crusaders' coastal stronghold of Acre. Unlike the Crusader fortresses along the coastline, which were demolished upon their capture by the Mamluks, Baybars spared the fortress of Safed. 
He likely preserved it because of the strategic value stemming from its location on a high mountain and its isolation from other Crusader fortresses. Moreover, Baybars determined that in the event of a renewed Crusader invasion of the coastal region, a strongly fortified Safed could serve as an ideal headquarters to confront the Crusader threat. In 1268, he had the fortress repaired, expanded and strengthened. He commissioned numerous building works in the town of Safed, including caravanserais, markets and baths, and converted the town's church into a mosque. The mosque, called Jami al-Ahmar (the Red Mosque), was completed in 1275. By the end of Baybars's reign, Safed had developed into a prosperous town and fortress. Baybars assigned fifty-four mamluks, at the head of whom was Emir Ala al-Din Kandaghani, to oversee the management of Safed and its dependencies. From the time of its capture, the city was made the administrative center of Mamlakat Safad, one of the seven mamlakas (provinces) that made up Mamluk Syria, whose governors were typically appointed from Cairo. Initially, its jurisdiction corresponded roughly with the Crusader castellany. After the fall of the Montfort Castle to the Mamluks in 1271, the castle and its dependency, the Shaghur district, were incorporated into Mamlakat Safad. The territorial jurisdiction of the mamlaka eventually spanned the entire Galilee and the lands further south down to Jenin. The geographer al-Dimashqi, who died in Safed in 1327, wrote around 1300 that Baybars built a "round tower and called it Kullah ..." after leveling the old fortress. The tower was built in three stories, and provided with provisions, halls, and magazines. Under the structure, a cistern collected enough rainwater to regularly supply the garrison. The governor of Safed, Emir Baktamur al-Jukandar (the Polomaster; r. 1309–1311), built a mosque later called after him in the northeastern section of the city. The geographer Abu'l Fida (1273–1331), the ruler of Hama, described Safed as follows: "[Safed] was a town of medium size. It has a very strongly built castle, which dominates the Lake of Tabariyyah [Sea of Galilee]. There are underground watercourses, which bring drinking-water up to the castle-gate... Its suburbs cover three hills... Since the place was conquered by Al Malik Adh Dhahir [Baybars] from the Franks [Crusaders], it has been made the central station for the troops who guard all the coast-towns of that district." The native qadi (Islamic head judge) of Safed, Shams al-Din al-Uthmani, composed a text about Safed called Ta'rikh Safad (the History of Safed) during the rule of its governor Emir Alamdar (r. 1372–1376). The extant parts of the work consist of ten folios largely devoted to Safed's distinguishing qualities, its dependent villages, agriculture, trade and geography, with no information about its history. His account reveals that the city's dominant features were its citadel, the Red Mosque and its towering position over the surrounding landscape. He noted Safed lacked "regular urban planning", madrasas (schools of Islamic law), ribats (hostels for military volunteers) and defensive walls, and that its houses were clustered in disarray and its streets were not distinguishable from its squares. He attributed the city's shortcomings to the dearth of generous patrons. A device for transporting buckets of water, called the satura, existed in the city mainly to supply the soldiers of the citadel; surplus water was distributed to the city's residents. 
Al-Uthmani praised the natural beauty of Safed, its therapeutic air, and noted that its residents took strolls in the surrounding gorges and ravines. The Black Death brought about a decline in the population of Safed from 1348 onward. There is little available information about the city and its dependencies during the last century of Mamluk rule (c. 1418 – c. 1516), though travelers' accounts describe a general decline precipitated by famine, plagues, natural disasters and political instability. In 1481, Joseph Mantabia reported that 300 Jewish families lived in Safed and its surrounding villages. While the accuracy of this figure is uncertain, it reflects the town's growing importance as a center of Jewish life, particularly with the arrival of Sephardic Jews fleeing persecution in Spain and Portugal. The Ottomans conquered Mamluk Syria following their victory at the Battle of Marj Dabiq in northern Syria in 1516. Safed's inhabitants sent the keys of the town citadel to Sultan Selim I after he captured Damascus. No fighting was recorded around Safed, which was bypassed by Selim's army on the way to Mamluk Egypt. The sultan placed the district of Safed under the jurisdiction of the Mamluk governor of Damascus, Janbirdi al-Ghazali, who had defected to the Ottomans. Rumors in 1517 that Selim had been slain by the Mamluks precipitated a revolt against the newly appointed Ottoman governor by the townspeople of Safed, which resulted in wide-scale killings, many of which targeted the city's Jews, who were viewed as sympathizers of the Ottomans. Safed became the capital of the Safed Sanjak, roughly corresponding with Mamlakat Safad but excluding most of the Jezreel Valley and the area of Atlit, part of the larger province of Damascus Eyalet. In 1525/26, the population of Safed consisted of 633 Muslim families, 40 Muslim bachelors, 26 Muslim religious persons, nine disabled Muslims, 232 Jewish families, and 60 military families. In 1549, under Sultan Suleiman the Magnificent, a wall was constructed and troops were garrisoned to protect the city. In 1553/54, the population consisted of 1,121 Muslim households, 222 Muslim bachelors, 54 Muslim religious leaders, 716 Jewish households, 56 Jewish bachelors, and nine disabled persons. In the 16th century, at least, Safed was the only kasaba (city) in the sanjak; in 1555 it was divided into nineteen mahallas (quarters), seven Muslim and twelve Jewish. The total population of Safed rose from 926 households in 1525–26 to 1,931 households in 1567–1568. Among these, the Jewish population rose from a mere 233 households in 1525 to 945 households in 1567–1568. The Muslim quarters were Sawawin, located west of the fortress; Khandaq (the moat); Ghazzawiyah, which had likely been settled by Gazans; Jami' al-Ahmar (the Red Mosque), located south of the fortress and named for the local mosque; al-Akrad, which dated to the Middle Ages and continued to exist through the 19th century, and whose inhabitants were mainly Kurds; al-Wata (the lower), the southernmost quarter of Safed, situated below the city; and al-Suq, named after the market or mosque located within the quarter. The Jewish quarters were all situated west of the fortress. 
Each quarter was named for the place of origin of its inhabitants: Purtuqal (Portugal), Qurtubah (Cordoba), Qastiliyah (Castile), Musta'rib (Jews of local, Arabic-speaking origin), Magharibah (northwestern Africa), Araghun ma' Qatalan (Aragon and Catalonia), Majar (Hungary), Puliah (Apulia), Qalabriyah (Calabria), Sibiliyah (Seville), Taliyan (Italian) and Alaman (German). In the 15th and 16th centuries, several well-known Sufi mystics of the school of Ibn Arabi lived in Safed. The Sufi sage Ahmad al-Asadi (1537–1601) established a zawiya (Sufi lodge) called the Sadr Mosque in the city. Safed became a center of Kabbalah (Jewish mysticism) during the 16th century. After the expulsion of the Jews from Spain in 1492, many prominent rabbis found their way to Safed, among them the Kabbalists Isaac Luria and Moses ben Jacob Cordovero; Joseph Caro, the author of the Shulchan Aruch; and Solomon Alkabetz, composer of the Shabbat hymn "Lekha Dodi". The kabbalistic response to the trauma of the exile varied widely, ranging from a quietistic approach adopted by the Italian and North African kabbalists to a more activist, apocalyptic approach which sought signs of the imminent redemption. The expulsion was seen by many as the tribulation that would herald the beginning of the messianic age as foretold in rabbinic literature. The spiritualization of religious life culminated in a creative outburst of religious innovation in Safed in the second half of the sixteenth century as a response to the expulsion. This spiritual revolution spread from Safed and transformed the practice of Judaism throughout the Jewish world. The influx of Sephardic Jews—reaching its peak under the rule of sultans Suleiman the Magnificent and Selim II—made Safed a global center for Jewish learning and a regional center for trade throughout the 15th and 16th centuries. Sephardi Jews and other Jewish immigrants by then outnumbered Musta'arabi Jews in the city. During this period, the Jewish community developed the textile industry in Safed, transforming the town into an important and lucrative wool production and textile manufacturing centre. There were more than 7,000 Jews in Safed in 1576, when Murad III proclaimed the forced deportation of 1,000 wealthy Jewish families to Cyprus to boost the island's economy. There is no evidence that the edict, or a second one issued the following year ordering the removal of 500 families, was ever enforced. In 1584, there were 32 synagogues registered in the town. A Hebrew printing press, the first in West Asia, was established in Safed in 1577 by Eliezer ben Isaac Ashkenazi of Prague and his son, Isaac. By the early part of the 17th century, Safed was a small town. In 1602, the paramount chief of the Druze in Mount Lebanon, Fakhr al-Din II of the Ma'n dynasty, was appointed the sanjak-bey (district governor) of Safed, in addition to his governorship of the neighbouring Sidon-Beirut Sanjak to the north. In the preceding years, the Safed Sanjak had entered a state of ruin and desolation and was often the scene of conflict between the local Druze and Shia Muslim peasants and the Ottoman authorities. By 1605, Fakhr al-Din had established peace and security in the sanjak, with highway brigandage and Bedouin raids having ceased under his watch. Trade and agriculture consequently thrived and the population prospered. 
He formed close relations with the city's Sunni Muslim ulama (religious scholars), particularly the mufti, al-Khalidi al-Safadi of the Hanafi school of fiqh (Islamic jurisprudence), who became his de facto court historian. The Ottomans drove Fakhr al-Din into European exile in 1613, but his son Ali became governor in 1615. Fakhr al-Din returned to his domains in 1618 and five years later regained the governorship of Safed, which the Ma'n dynasty had lost, after his victory against the governor of Damascus at the Battle of Anjar. Around 1625, the orientalist Franciscus Quaresmius spoke of Safed being inhabited "chiefly by Hebrews, who had their synagogues and schools, and for whose sustenance contributions were made by the Jews in other parts of the world." According to the historian Louis Finkelstein, the Jewish community of Safed was plundered by the Druze under Mulhim ibn Yunus, nephew of Fakhr al-Din. Five years later, Fakhr al-Din was routed by the Ottoman governor of Damascus, Mulhim abandoned Safed, and its Jewish residents returned. The Druze again attacked the Jews of Safed in 1656. During the power struggle between Fakhr al-Din's heirs (1658–1667), each faction attacked Safed. In the intra-communal turmoil among the Druze following the death of Mulhim, the 1660 destruction of Safed targeted the Jews there and in Tiberias; only a few of the former Jewish residents returned to the city before 1662. Survivors relocated mainly to Sidon or Jerusalem. The Safed Sanjak and the neighbouring Sidon-Beirut Sanjak to the north were administratively separated from Damascus in 1660 to form the Sidon Eyalet, of which Safed was briefly the capital. The province was created by the imperial government to check the power of the Druze of Mount Lebanon, as well as the Shia of Jabal Amil. As nearby Tiberias remained desolate for several decades, Safed gained a key position among Galilean Jewish communities. In 1665, the Sabbatai Sevi movement arrived in Safed. In the 1670s, the Turkish traveller Evliya Çelebi recorded that Safed contained three caravanserais, several mosques, seven zawiyas, and six hammams. The Red Mosque was restored by Safed's governor Salih Bey in 1671/72, at which point it measured about 120 by 80 feet (37 m × 24 m), had an all-masonry interior, a cistern to collect rainwater in the winter for drinking, and a tall minaret over its southern entrance; the minaret had been destroyed before the end of the 17th century. The Tiberias-based sheikh Daher al-Umar of the local Arab Zaydan clan, whose father Umar al-Zaydani had been the governor and tax farmer of Safed in 1702–1706, wrested control of Safed and its tax farm from its native strongman, Muhammad Naf'i, through military pressure and diplomacy by 1740. The Naf'i, Shahin, and Murad families continued to farm the taxes of Safed and its countryside into the 1760s as Daher's subordinates. By the 1760s, Daher entrusted Safed to his son Ali, who made the town his headquarters. After Daher was killed by Ottoman imperial forces, the governor of Sidon, Jazzar Pasha, moved to oust Daher's sons from their Galilee strongholds. Ali made a final, unsuccessful stand against Jazzar Pasha from Safed, which was afterward captured and garrisoned by the governor. 
The simultaneous rise of Acre, established by Daher as his capital in 1750 and which served as the capital of the Sidon Eyalet under Jazzar Pasha (1775–1804) and his successors, Sulayman Pasha al-Adil (1805–1819) and Abdullah Pasha (1820–1831), contributed to the political decline of Safed. It became a subdistrict center with limited local influence, belonging to the Acre Sanjak. Underdevelopment and a series of natural disasters further contributed to Safed's decline from the 17th to the mid-19th centuries. An outbreak of plague decimated the population in 1742, and the Near East earthquakes of 1759 left the city in ruins, killing 200 residents. An influx of Russian Jews in 1776 and 1781, and of Lithuanian Jews of the Perushim movement in 1809 and 1810, reinvigorated the Jewish community. In 1812, another plague killed 80% of the Jewish population. After Abdullah Pasha of Acre ordered the killing of his Jewish vizier Haim Farhi, who had served in the same post under Jazzar and Sulayman, the governor imprisoned the Jewish residents of Safed on 12 August 1820, accusing them of tax evasion concealed by Farhi; they were released upon paying a ransom. The war between Abdullah Pasha and the influential Farhi brothers in Constantinople and Damascus in 1822–1823 prompted Jewish flight from the Galilee in general, though by 1824 Jewish immigrants were steadily moving to the city. The forces of Muhammad Ali of Egypt wrested control of the Levant from the Ottomans in 1831, and in the same year many Jews who had fled the Galilee, including Safed, under Abdullah Pasha returned as a result of Muhammad Ali's liberal policies toward Jews. Safed was raided by Druze in 1833 at the approach of Ibrahim Pasha, the Egyptian governor of the Levant. In the following year, the Muslim notables of the city, led by Salih al-Tarshihi and opposed to the Egyptian policy of conscription, joined the peasants' revolt in Palestine. During the revolt, rebels plundered the city for over thirty days. Emir Bashir Shihab II of Mount Lebanon and his Druze fighters entered its environs in support of the Egyptians and compelled Safed's leaders to surrender. The Galilee earthquake of 1837 killed about half of Safed's 4,000-strong Jewish community, destroyed all fourteen of its synagogues and prompted the flight of 600 Perushim to Jerusalem; the surviving Sephardic and Hasidic Jews mostly remained. Of the 2,158 residents of Safed who died, 1,507 were Ottoman subjects and the rest foreign citizens. The Jewish quarter was situated on the hillside and was particularly hard hit; the southern and Muslim section of the town experienced considerably less damage. The following year, in 1838, Druze rebels and local Muslims raided Safed for three days. Ottoman rule was restored across the Levant in 1840. The Empire-wide Tanzimat reforms, first adopted in the 1840s, brought about a steady rise in Safed's population and economy. In 1849 Safed had a total estimated population of 5,000, of whom 2,940–3,440 were Muslims, 1,500–2,000 were Jews and 60 were Christians. The population was estimated at 7,000 in 1850–1855, of whom 2,500–3,000 were Jews. The Jewish population increased in the last half of the 19th century through immigration from Persia, Morocco, and Algeria. Moses Montefiore (d. 1885) visited Safed seven times and financed much of the rebuilding of Safed's synagogues and Jewish houses. In 1864 the Sidon Eyalet was absorbed into the new province of Syria Vilayet. 
In the new province, Safed remained part of the Acre Sanjak and served as the center of a kaza (third-level subdivision), whose jurisdiction covered the villages around the city and the subdistrict of Mount Meron (Jabal Jarmaq). In the Ottoman survey of Syria in 1871, Safed had 1,395 Muslim households, 1,197 Jewish households and three Christian households. The survey recorded a relatively high number of businesses in the city, namely 227 shops, fifteen mills, fourteen bakeries and four olive oil factories, an indicator of Safed's long-established role as an economic hub for the people of the Upper Galilee, the Hula Valley, the Golan Heights and parts of modern-day South Lebanon. Through the late 19th century, Safed's merchants served as middlemen in the Galilee grain trade, selling the wheat, pulses and fruit grown by the peasants of the Galilee to the traders of Acre, who in turn exported at least part of the merchandise to Europe. Safed also maintained extensive trade with the port of Tyre. The bulk of trade in Safed, which was traditionally dominated by the city's Jews, largely passed to its Muslim merchants during the late 19th century, particularly trade with the local villagers; Muslim traders offered higher credit to the peasants and were able to obtain government assistance for debt repayments. The wealth of Safed's Muslims increased, and a number of the city's leading Muslim families took advantage of the Ottoman Land Code of 1858 to purchase extensive tracts around Safed. The major Muslim landowning clans were the Soubeh, Murad and Qaddura. The latter owned about 50,000 dunams toward the end of the century, including eight villages around Safed. In 1878 the municipal council of Safed was established. In 1888 the Acre Sanjak, including the Safed Kaza, became part of the new province of Beirut Vilayet, an administrative state of affairs which persisted until the Empire's fall in 1918. The centralization and stability brought by the imperial reforms solidified the political status and practical influence of Safed in the Upper Galilee. The Ottomans developed Safed into a center for Sunni Islam to counterbalance the influence of non-Muslim communities in its environs and the Shia Muslims of Jabal Amil. Along with the three major landowning families, the Muslim ulama (religious scholar) families of Nahawi, Qadi, Mufti and Naqib comprised the urban elite (a'yan) of the city. The Sunni courts of Safed arbitrated over cases in Akbara, Ein al-Zeitun and as far away as Mejdel Islim. According to the late 19th-century account of the British missionary E. W. G. Masterman, the Muslim families of Safed included Kurds, Damascenes, Algerians, Bedouin from the Jordan Valley, and people from the villages around Safed. Many Damascenes had been settled in the city by Baybars when he conquered Safed in 1266. Until the late 19th century the Muslims of Safed maintained strong social and cultural connections with Damascus. The government settled Algerian and Circassian exiles in the countryside of Safed in the 1860s and 1878, respectively, possibly in an effort to strengthen the Muslim character of the area. At least two Muslim families in the city itself, Arabi and Delasi, were of Algerian origin, though they accounted for a small proportion of the city's overall Muslim population. Masterman noted that the Muslims of Safed were conservative, "active and hardy", and that they "dress[ed] well and move[d] about more than the people from the region of southern Palestine". 
They lived mainly in three quarters of the city: al-Akrad, whose residents were mostly laborers; Sawawin, home to the Muslim a'yan households and the city's Catholic community; and al-Wata, whose inhabitants were largely shopkeepers and minor traders. The entire Jewish population lived in the Gharbieh (western) quarter. Safed's population reached over 15,000 in 1879, of whom 8,000 were Muslims and 7,000 Jews. A population list from about 1887 showed that Safed had 24,615 inhabitants: 2,650 Jewish households, 2,129 Muslim households and 144 Roman Catholic households. Arab families in Safed whose social status rose as a result of the Tanzimat reforms included the Asadi, whose presence in Safed dated to the 16th century; Hajj Sa'id; Hijazi; Bisht; Hadid; Khouri, a Christian family whose progenitor moved to the city from Mount Lebanon during the 1860 civil war; and Sabbagh, a long-established Christian family in the city related to Daher al-Umar's fiscal adviser Ibrahim al-Sabbagh. Many members of these families became officials in the civil service and local administrations, or businessmen. When the Ottomans established a branch of the Agricultural Bank in the city in 1897, all of its board members were resident Arabs, the most influential of whom were Husayn Abd al-Rahim Effendi, Hajj Ahmad al-Asadi, As'ad Khouri and Abd al-Latif al-Hajj Sa'id. The latter two also became board members of the Chamber of Commerce and Agriculture branch opened in Safed in 1900. In the last decade of the 19th century, Safed contained 2,000 houses, four mosques, three churches, two public bathhouses, one caravanserai, two public sabils, nineteen mills, seven olive oil presses, ten bakeries, fifteen coffeehouses, forty-five stalls and three shops. Safed was the centre of the Safad Subdistrict. According to a census conducted in 1922 by the British Mandate authorities, Safed had a population of 8,761 inhabitants, consisting of 5,431 Muslims, 2,986 Jews, 343 Christians and others. Safed remained a mixed city during the British Mandate for Palestine, and ethnic tensions between Jews and Arabs rose during the 1920s. During the 1929 Palestine riots, Safed and Hebron became major clash points. In the Safed massacre, 20 Jewish residents were killed by local Arabs. Safed lay within the territory recommended for the proposed Jewish state under the United Nations Partition Plan for Palestine. In 1948 the city was home to about 12,000 Arabs and about 1,700 Jews, mostly religious and elderly. On 5 January 1948, Arabs attacked the Jewish Quarter. In February 1948, during the civil war, Muslim Arabs attacked a Jewish bus attempting to reach Safed, and the Jewish quarter of the town came under siege by the Muslims. British forces that were present did not intervene. According to Martin Gilbert, food supplies ran short: "Even water and flour were in desperately short supply. Each day, the Arab attackers drew closer to the heart of the Jewish quarter, systematically blowing up Jewish houses as they pressed in on the central area." On April 16, the same day that British forces evacuated Safed, 200 local Arab militiamen, supported by over 200 Arab Liberation Army soldiers, tried to take over the city's Jewish Quarter. They were repelled by the Jewish garrison, consisting of some 200 Haganah fighters, men and women, boosted by a Palmach platoon. The Palmach ground attack on the Arab section of Safed took place on 6 May, as part of Operation Yiftach. 
The first phase of the Palmach plan to capture Safed was to secure a corridor through the mountains by capturing the Arab village of Biriyya. The Arab Liberation Army placed artillery pieces on a hill adjacent to the Jewish quarter and began shelling it. The Palmach's Third Battalion failed to take the main objective, the "citadel", but "terrified" the Arab population sufficiently to prompt further flight, as well as urgent appeals for outside help and an effort to obtain a truce. The secretary-general of the Arab League, Abdul Rahman Hassan Azzam, stated that the goal of Plan Dalet was to drive out the inhabitants of Arab villages along the Syrian and Lebanese frontiers, particularly places on the roads by which Arab regular forces could enter the country. He noted that Acre and Safed were in particular danger. However, the appeals for help were ignored, and the British, now less than a week away from the end of the British Mandate of Palestine, also did not intervene against the second and final Haganah attack, which began on the evening of 9 May with a mortar barrage on key sites in Safed. Following the barrage, Palmach infantry, in bitter fighting, took the citadel, Beit Shalva and the police fort, Safed's three dominant buildings. Through 10 May, Haganah mortars continued to pound the Arab neighbourhoods, causing fires in the market area and in the fuel dumps, which exploded. The Palmach "intentionally left open the exit routes for the population to 'facilitate' their exodus". According to Gilbert, "The Arabs of Safed began to leave, including the commander of the Arab forces, Adib Shishakli (later Prime Minister of Syria). With the police fort on Mount Canaan isolated, its defenders withdrew without fighting. The fall of Safed was a blow to Arab morale throughout the region... With the invasion of Palestine by regular Arab armies believed to be imminent – once the British had finally left in eleven or twelve days' time – many Arabs felt that prudence dictated their departure until the Jews had been defeated and they could return to their homes." According to the historian Mustafa Abbasi, the exodus of the Arabs of Safed had three phases. The first was due to the departure of the British, compounded by the failure of an attack on the Jewish quarter and a disagreement between the Jordanian and Syrian commanders. The second was due to the fall of nearby Ein al-Zeitun and the massacre that Jewish forces committed there. The third was due to the deliberate creation of panic by Jewish forces. Some 12,000 Arabs, with some estimates reaching 15,000, fled Safed and were a "heavy burden on the Arab war effort". Among them was the family of Palestinian Authority President Mahmoud Abbas. The city was fully under the control of Jewish paramilitary forces by May 11, 1948. Early in June, Jewish dignitaries from Safed journeyed to Tel Aviv to ask the government to block the return of Arabs to the city, threatening to abandon it if the latter were allowed back. They reasoned that since most of the Arabs' property had been seized or stolen in the meantime, the Jewish community would be unable to withstand the pressure of the returnees' demands for restitution. In 1974, 25 Israeli Jews from Safed, mainly schoolchildren, were killed in the Ma'alot massacre. Over the 1990s and early 2000s, the town absorbed thousands of Russian Jewish immigrants and Ethiopian Beta Israel. In July 2006, "Katyusha" rockets fired by Hezbollah from Southern Lebanon hit Safed, killing one man and injuring others. 
Many residents fled the town for the duration of the conflict. On July 22, four people were injured in a rocket attack. The town has retained its status as a centre of Jewish studies, home to numerous religious educational institutions. In 2010, eighteen senior rabbis led by the chief rabbi of Safed, Shmuel Eliyahu, issued an edict urging the city's residents not to rent or sell property to Arabs, warning of an "Arab takeover"; Arabs constitute a tiny fraction of the population, and the statement was generally perceived to be directed at the 1,300 Arab students enrolled at Zefat Academic College. Demographics In 2008, the population of Safed was 32,000. According to CBS figures in 2001, the ethnic makeup of the city was 99.2% Jewish and other non-Arabs, with no significant Arab population. 43.2% of the residents were 19 years of age or younger, 13.5% between 20 and 29, 17.1% between 30 and 44, 12.5% from 45 to 59, 3.1% from 60 to 64, and 10.5% 65 years of age or older. The city is home to a relatively large community of Haredi Jews. The village of Akbara on the city's southwestern outskirts, which has a population of about 500 Arab Muslims, most of whom belong to a single clan, the Halihal, is under Safed's municipal jurisdiction. Seismology The city is located above the Dead Sea Transform, and is one of the cities in Israel most at risk of earthquakes (along with Tiberias, Beit She'an, Kiryat Shmona, and Eilat). Geography Safed is 40 kilometers (25 mi) east of Acre and 20 kilometers (12 mi) north of Tiberias. Safed has a Mediterranean climate (Köppen climate classification: Csa) with hot, dry summers and cool, rainy and occasionally snowy winters. The city receives 682 mm (27 in) of precipitation per year. Summers are rainless and hot, with an average high temperature of 31 °C (88 °F) and an average low temperature of 20 °C (68 °F). Winters are cool and wet, and precipitation is occasionally in the form of snow. Winters have an average high temperature of 10 °C (50 °F) and an average low temperature of 5 °C (41 °F). Education According to CBS, the city has 25 schools and 6,292 students. There are 18 elementary schools with a student population of 3,965, and 11 high schools with a student population of 2,327. 40.8% of Safed's 12th graders were eligible for a matriculation (bagrut) certificate in 2001. The Zefat Academic College, originally an extension of Bar-Ilan University, was granted independent accreditation by Israel's Council of Higher Education in 2007. For the 2011–2012 school year, the college began a program designed specifically for Haredi women, created to give haredi women living in the Upper Galilee access to higher education while maintaining strict religious practice. The program accomplishes this through separate classes for male and female students, and classes are taught during certain hours so as to allow women to fulfill other aspects of their religious observance. In October 2011, Israel's fifth medical school, the Azrieli Faculty of Medicine, opened in Safed, housed in a renovated historic building in the centre of town that was once a branch of Hadassah Hospital. An extension of Bar-Ilan University created to train physicians in the Upper Galilee region, it conducts clinical instruction in six hospitals in the region. On March 8, 2021, Israeli Prime Minister Benjamin Netanyahu announced that Israel would establish its tenth university in Safed, citing a growing need for a university in the country's northern district. 
Plans have been in place to establish a university in the Galilee since 2005, but no progress was made until 2015, when Netanyahu vowed to start working on the project during a Galilee Conference. As one of Judaism's Four Holy Cities, Safed hosts several yeshivas. The Haredi Yeshivat Tzfat and associated institutions are headed by Rabbi Mordechai Kaplan. The Religious Zionist Hesder Yeshiva of Tzfat was founded in 1997 by Rabbi Benyahu Broner and is today headed by Rabbi Shmuel Eliyahu, with approximately 120 students. For women, Sharei Bina is a midrasha (seminary) offering a one-year post-high-school program, with an increased focus on Jewish spirituality, including formal study of Kabbalistic topics. Chabad has several institutions, including Machon Alte for women and the advanced Kollel Tzemach Tzedek. The Livnot U'Lehibanot program in Safed provides an open, non-denominational atmosphere for young Jewish adults that combines volunteering, hiking and study with exploring Jewish heritage. Culture In the 1950s and 1960s, Safed was known as Israel's art capital. An artists' colony established in the old Arab quarter was a hub of creativity that drew artists from around the country, among them Yitzhak Frenkel, Yosl Bergner, Moshe Castel, Menachem Shemi, Shimshon Holzman and Rolly Schaffer. In honor of the opening of the Glitzenstein Art Museum in 1953, the artist Mane Katz donated eight of his paintings to the city. Today the area contains a large number of galleries and workshops run by individual artists and art vendors. Several museums and galleries operate in the historical homes of major Israeli artists, such as the Frenkel Frenel Museum and the Beit Castel gallery (in Moshe Castel's former home). In the 1960s, Safed was home to the country's top nightclubs, hosting the debut performances of Naomi Shemer, Aris San, and other singers. More recently, Safed has been hailed as the klezmer capital of the world, hosting an annual Klezmer Festival that attracts top musicians from around the globe. A school of world music, especially Eastern music, called Maqamat operates in the Artists' Quarter of Safed. Historic sites The Citadel Hill, in Hebrew HaMetzuda, rises east of the Old City and is named after the huge Crusader and then Mamluk castle built there during the 12th and 13th centuries, which continued in use until being totally destroyed by the 1837 earthquake. Its ruins are still visible. On the western slope beneath the ruins stands the former British police station, still pockmarked by bullet holes from the 1948 war. Before 1948, most of Safed's Jewish population lived in the northern section of the old city. Currently home to 32 synagogues, it is also referred to as the synagogue quarter and includes synagogues named after prominent rabbis of the town: the Abuhav, Alsheich, Karo and two named for Rabbi Isaac Luria, one Ashkenazi and the other Sephardi. Further south are two monumental Mamluk-period buildings. Southeast of the Artists' Quarter is the Saraya, the fortified governor's residence built by Daher al-Umar (1689/90–1775). A report about the "obliteration of non-Jewish historic sites in Safed" mentions a mausoleum, an ancient grave and an ancient mosque that was converted into a clubhouse. 
========================================
[SOURCE: https://en.wikipedia.org/wiki/Unfair_business_practices] | [TOKENS: 757]
Unfair business practices Unfair business practices (also unfair commercial practices) describes a set of actions by commercial organizations which are considered unjust and which may be unlawful. It includes practices which are covered by other areas of law, such as fraud, misrepresentation, and oppressive or unconscionable contract terms. Protections may be afforded to business-to-business dealings, or may be limited to those dealing as consumers. Regulation of such practices is a departure from traditional views of freedom to agree on contractual terms, summed up in the 1804 French Civil Code as qui dit contractuel dit juste (roughly, anything contractual is fair). Canada Canadian provinces enact their own consumer protection laws, which differ in scope and coverage. For example, Saskatchewan's Consumer Protection Act says:

It is an unfair practice for a supplier, in a transaction or proposed transaction involving goods or services, to:
(a) do or say anything, or fail to do or say anything, if as a result a consumer might reasonably be deceived or misled;
(b) make a false claim;
(c) take advantage of a consumer if the person knows or should reasonably be expected to know that the consumer:
(i) is not in a position to protect his or her own interests; or
(ii) is not reasonably able to understand the nature of the transaction or proposed transaction.

For example, the Saskatchewan Act has been applied to a case in which an automobile dealer mis-sold cars, based on allegations of false claims, representing goods as new or unused when they were not, and using exaggeration, innuendo, or ambiguity when representing material facts. Other Canadian provinces have similar laws, headed the Unfair Trade Practices Act (Alberta), Trade Practices Act (British Columbia), Business Practices Act (Ontario), Consumer Protection Act (Quebec), and Trade Practices Inquiry Act (Manitoba). European Union Under the Unfair Commercial Practices Directive 2005 (amended 2017), each member state is required to regulate unfair business practices. As the EU described its objectives: "The objective of the EU Directive on unfair commercial practices from 2005 was to boost consumer confidence and make it easier for businesses, especially small and medium-sized enterprises, to trade across borders. It is the overarching EU legislation regulating unfair commercial practices that occur before, during and after a business-to-consumer transaction has taken place." For instance, in August 2025 a German court barred Apple from advertising the Apple Watch as a "CO2-neutral product", finding the claim misleading to consumers. United States In the United States, the Federal Trade Commission addresses unfair business practices. It has in the past included in its mission the goal of preventing "fraud, deception, and unfair business practices in the marketplace". It does so by "collecting reports from consumers and conducting investigations, suing companies and people that break the law, developing rules to maintain a fair marketplace, and educating consumers and businesses about their responsibilities". Individual states within the U.S. are also responsible for protecting consumers against unfair practices. United Kingdom In the United Kingdom, unfair business practices are treated as unfair terms in English contract law by the Unfair Contract Terms Act 1977, the Consumer Rights Act 2015 and the Consumer Protection from Unfair Trading Regulations. 
Australia In Australia, unfair business practices are regulated under the Australian Consumer Law, which is enforced by the Australian Competition & Consumer Commission (ACCC). For example, in 2023 the ACCC took action against the airline Qantas for, among other things, advertising and allowing customers to book flights that were no longer available. The law provides, among other things, that "A person must not, in trade or commerce, engage in conduct that is misleading or deceptive or is likely to mislead or deceive." 
========================================
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-Dunn_Giribet_Edgecombe_Hejnol_2014-133] | [TOKENS: 6011]
Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly diverse clade whose members have a bilaterally symmetric and significantly cephalised body plan, and the vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the latter of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation, 485.4 Mya. A set of 6,331 groups of genes common to all living animals has been identified; these may have arisen from a single common ancestor that lived about 650 Mya, during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, arts, literature, heraldry, politics, and sports. 
Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot produce their own food, a feature they share with fungi; instead, animals ingest organic material and digest it internally. Animals also have structural characteristics that set them apart from all other living things. Typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally lead to inbreeding depression within a population due to the increased prevalence of harmful recessive traits. Animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites. 
Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction in which the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. The selective pressures they impose on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic and competitive coevolutions. Almost all multicellular predators are animals. Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels indirectly acquire the nutrients by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow, to sustain basal metabolism, and to fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move on to land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments are the Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera and Nematoda. Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres. 
Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. Estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine) and free-living or parasitic ways of life, have been tabulated (table omitted). Such species estimates are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011. Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges, based on molecular clock estimates for the origin of 24-ipc production in both groups: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia established their animal nature. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may, however, be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear, for example, in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms. 
However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures.

Phylogeny

Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing an external phylogeny in which the animals are successively more distantly related to the Filasterea, the Pluriformea, the Ichthyosporea, and the Holomycota (which includes the fungi); uncertain relationships are indicated with dashed lines in their cladogram. The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. The relationships at the base of the animal tree have been debated. Other than the Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. Like the sponges, the Placozoa have no symmetry and were often considered a "missing link" between protists and multicellular animals; the presence of Hox genes in Placozoa, however, shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, favouring a sponge-sister tree in which the Porifera branch first, followed by the Ctenophora, then the Placozoa, and finally the Cnidaria plus Bilateria; their ctenophore-sister tree simply interchanges the places of the ctenophores and sponges. Conversely, a 2023 study by Darrin Schultz and colleagues used ancient gene linkages to support a ctenophore-sister phylogeny, in which the Ctenophora branch first, followed by the Porifera, the Placozoa, and finally the Cnidaria plus Bilateria. Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined and under active research.
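The two branching orders just described can be written compactly as nested groupings. The sketch below is an editorial illustration only: the tuple encoding and the Newick-style rendering are conveniences for the reader, not part of either cited study.

```python
# Illustrative encoding of the two competing hypotheses for the base of
# the animal tree. Taxon names and branching orders come from the text;
# the nested-tuple representation itself is an assumption made here.

# Sponge-sister (Feuda et al., 2017): Porifera branches first.
sponge_sister = ("Porifera", ("Ctenophora", ("Placozoa", ("Cnidaria", "Bilateria"))))

# Ctenophore-sister (Schultz et al., 2023): Ctenophora branches first.
ctenophore_sister = ("Ctenophora", ("Porifera", ("Placozoa", ("Cnidaria", "Bilateria"))))

def to_newick(tree) -> str:
    """Render a nested-tuple tree in Newick notation."""
    if isinstance(tree, str):
        return tree
    return "(" + ",".join(to_newick(child) for child in tree) + ")"

print(to_newick(sponge_sister) + ";")      # (Porifera,(Ctenophora,(Placozoa,(Cnidaria,Bilateria))));
print(to_newick(ctenophore_sister) + ";")  # (Ctenophora,(Porifera,(Placozoa,(Cnidaria,Bilateria))));
```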
The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, which have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. A modern consensus phylogenetic tree for the Bilateria comprises the Xenacoelomorpha and, within the Nephrozoa, the deuterostome Ambulacraria and Chordata and the protostome Ecdysozoa and Spiralia. Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. However, over evolutionary time, descendant species have evolved which have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes. The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.
History of classification

In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches', with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes or radiata (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia.

In human culture

The human population exploits a large number of other animal species for food, both of domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food, and a smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world.
Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. Working animals, including cattle and horses, have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccines were discovered in the 18th century. Some medicines, such as the cancer drug trabectedin, are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, with invertebrates such as tarantulas, octopuses, and praying mantises, reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots all finding a place. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also the symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship.
========================================
[SOURCE: https://en.wikipedia.org/wiki/IBM_SSEC] | [TOKENS: 2435]
IBM SSEC

The IBM Selective Sequence Electronic Calculator (SSEC) was an electromechanical computer built by IBM. Its design was started in late 1944 and it operated from January 1948 to August 1952. It had many of the features of a stored-program computer, and was the first operational machine able to treat its instructions as data, but it was not fully electronic. Although the SSEC proved useful for several high-profile applications, it soon became obsolete. As the last large electromechanical computer ever built, its greatest success was the publicity it provided for IBM.

History

During World War II, International Business Machines Corporation (IBM) funded and built an Automatic Sequence Controlled Calculator (ASCC) for Howard H. Aiken at Harvard University. The machine, formally dedicated in August 1944, was widely known as the Harvard Mark I. The president of IBM, Thomas J. Watson Sr., did not like Aiken's press release, which gave no credit to IBM for its funding and engineering effort. Watson and Aiken decided to go their separate ways, and IBM began work on a project to build its own larger and more visible machine. Astronomer Wallace John Eckert of Columbia University provided specifications for the new machine; the project budget of almost $1 million was an immense amount for the time. Francis "Frank" E. Hamilton (1898–1972) supervised the construction of both the ASCC and the SSEC. Robert Rex Seeber Jr. was also hired away from the Harvard group, and became known as the chief architect of the new machine. Modules were manufactured in IBM's facility at Endicott, New York, under Director of Engineering John McPherson, after the basic design was ready in December 1945. The February 1946 announcement of the fully electronic ENIAC energized the project. The new machine, called the IBM Selective Sequence Electronic Calculator (SSEC), was ready to be installed by August 1947. Watson called such machines "calculators" because "computer" then referred to humans employed to perform calculations, and he "wanted to convey the message that IBM's machines were not designed to replace people. Rather they were designed to help people, by relieving them of drudgery.": 143 The SSEC was installed on three sides of a room on the ground floor of a building near IBM's headquarters at 590 Madison Avenue in New York City, behind a large window where it was visible to people passing by on the busy street. The space had formerly been occupied by a women's shoe store. The noisy SSEC was sometimes called "Poppa" by passing pedestrians. It was dedicated and first demonstrated to the public on January 27, 1948. A. Wayne Brooke served as the chief electronic engineer for the machine's operation starting in 1950. Herb Grosch, the second person with a Ph.D. hired by IBM, was one of its first programmers. Other early programmers included Edgar "Ted" Codd and John Backus. Elizabeth "Betsy" Stewart was chief operator, and often appeared in publicity photos. The SSEC was an unusual hybrid of vacuum tubes and electromechanical relays. Approximately 12,500 vacuum tubes were used in the arithmetic unit, control, and its eight (relatively high-speed) registers, which had an access time of less than one millisecond. About 21,400 relays were used for control and 150 lower-speed registers, with an access time of 20 milliseconds. The relay technology was similar to the ASCC's, based on technology invented by Clair D. Lake (1888–1958).
The arithmetic logic unit of the SSEC was a modified IBM 603 electronic multiplier, which had been designed by James W. Bryce. The bulky tubes were military-surplus radar technology, and they filled one entire wall. The memory was organized as signed 19-digit decimal numbers; multiplication was computed with 14 digits in each factor. Most of the quoted 400,000-digit capacity was in the form of reels of punched paper tape. Addition took 285 microseconds and multiplication 20 milliseconds, making arithmetic operations much faster than on the Harvard Mark I. Data that had to be retrieved quickly was held in electronic circuits; the remainder was stored in relays and as holes in three continuous card-stock tapes that filled another wall. A chain hoist was needed to lift the heavy reels of paper into place. The machine read instructions or data from 30 paper tape readers connected to three punches, and a table look-up unit consisted of another 36 paper tape readers. A punched card reader was used to load data, and results were produced on punched cards or high-speed printers. The 19-digit word was stored on the card-stock tape or registers in binary-coded decimal, resulting in 76 bits, with two extra bits indicating positive or negative sign and parity, while the two side rows were used for sprockets. The familiar 80 columns of IBM punched card technology were recorded sideways as one column of the tape. Using well-tested technology, the SSEC's calculations were accurate and precise for its time, but one early programmer, John Backus, said "you had to be there the entire time the program was running, because it would stop every three minutes, and only the people who had programmed it could see how to get it running again". ENIAC co-designer J. Presper Eckert (no relation to the IBM Eckert) called it "some big monstrosity over there that I don't think ever worked right". Seeber had carefully designed the SSEC to treat instructions as data, so they could be modified and stored under program control. IBM filed a patent based on the SSEC on January 19, 1949, which was later upheld as supporting the machine's stored-program ability.: 136 Each instruction could take input from any source (electronic or mechanical registers or tape readers), store the result in any destination (electronic or mechanical registers, tape or card punch, or printer), and gave the address of the next instruction, which could also come from any source. This made it powerful in theory. In practice, however, instructions were usually stored on paper tape, resulting in an overall rate of only about 50 instructions per second. The serial nature of the paper tape memory made programming the SSEC more like programming the calculators of the World War II era. For example, "loops" were usually literal loops of paper tape glued together. For each new program, tapes and card decks were literally "loaded" on the readers, and a plugboard in the printer was changed to modify output formatting. For these reasons, the SSEC is usually classified as the last of the "programmable calculator" machines rather than the first stored-program computer. The first application of the SSEC was calculating the positions of the Moon and planets, known as an ephemeris. This application was chosen by Eckert and Grosch as an improvement over the earlier interpolated tables calculated by Leslie Comrie at the UK's Nautical Almanac Office using commercial Hollerith machines.
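Stepping back to the word format described above: the signed 19-digit word lends itself to a short sketch. The layout below is an assumption for illustration only; the text specifies 19 binary-coded-decimal digits (76 bits) plus sign and parity bits, but the SSEC's actual bit ordering and parity convention are not given here.

```python
# A minimal sketch of a signed 19-digit BCD word like the SSEC's:
# 19 decimal digits at 4 bits each (76 bits) plus a sign bit give a
# 77-bit programming word, with a parity check bit alongside. The bit
# ordering and even-parity convention here are assumptions.

def encode_bcd_word(value: int) -> tuple[str, str]:
    """Encode a signed integer of up to 19 decimal digits as (word, parity)."""
    if abs(value) >= 10**19:
        raise ValueError("magnitude must fit in 19 decimal digits")
    sign = "1" if value < 0 else "0"
    digits = f"{abs(value):019d}"                    # zero-pad to 19 digits
    bcd = "".join(f"{int(d):04b}" for d in digits)   # 4 bits per digit -> 76 bits
    word = sign + bcd                                # 77 bits: sign + digits
    parity = str(word.count("1") % 2)                # assumed even-parity check bit
    return word, parity

word, parity = encode_bcd_word(-1234567890123456789)
print(len(word), parity)  # 77, followed by the check bit
```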
Each position of the Moon required about 11,000 additions, 9,000 multiplications, and 2,000 table look-ups, which took the SSEC about seven minutes. This application used the machine for about six months; by then other users were lined up to keep the machine busy. It has sometimes been said that the SSEC produced the Moon-position tables that were later used for plotting the course of the 1969 Apollo flight to the Moon. Records closer to 1969 suggest, however, that while there was a relationship, it was most likely less immediate. Mulholland and Devine (1968), working at the NASA Jet Propulsion Laboratory, reported that the JPL Ephemeris Tape System was "used for virtually all computations of spacecraft trajectories in the US space program", and that its lunar ephemeris at the time was an evaluation of the Improved Lunar Ephemeris incorporating a number of corrections. The sources they name are 'The Improved Lunar Ephemeris' (the report documenting the Eckert computations carried out on the SSEC, complete with lunar position results for 1952 to 1971), with corrections as described by Eckert et al. (1966) and in the Supplement to the AE 1968. Taken together, the corrections thus referenced modify practically every individual element of the lunar computations, so the space program appears to have used lunar data generated by a modified and corrected derivative of the computational procedure pioneered on the SSEC, rather than the directly resulting tables themselves. The first paying customer of the SSEC was General Electric. The SSEC was also used for calculations by the U.S. Atomic Energy Commission for the ANP project to power an airplane with a nuclear reactor. Robert D. Richtmyer of Los Alamos National Laboratory used the SSEC for some of the first large-scale applications of the Monte Carlo method. Llewellyn Thomas solved problems in the stability of laminar flow, programmed by Donald A. Quarles Jr. and Phyllis K. Brown. In 1949, Cuthbert Hurd was hired (also after a visit to the SSEC) and started a department of applied science; the operation of the SSEC was eventually put into that organization. The SSEC installation was one of the first to use a raised floor, so that visitors would not see unsightly cables or trip over them. The large array of flashing lights and noisy electromechanical relays made IBM very visible to the public. The SSEC appeared in the film Walk East on Beacon, which is based on a book by J. Edgar Hoover, and it received wide and positive press coverage. The SSEC attracted both customers and new employees; both Hurd and Backus were hired after seeing demonstrations of the facility. The 1946 ENIAC had more tubes than the SSEC and was faster in some operations, but was originally less flexible, needing to be rewired for each new problem. At the end of 1948 a new IBM 604 multiplier was announced, which used newer tube technology that already made the bulky tubes of the SSEC obsolete. By May 1949 the Card-Programmed Electronic Calculator was announced, and shipped in September. It was effectively a much scaled-down version of the SSEC technology to allow customers to perform similar calculations.
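A rough plausibility check can be run on the lunar-ephemeris timings quoted at the start of this section (285 microseconds per addition, 20 milliseconds per multiplication, about seven minutes per Moon position). Since no per-look-up time is quoted in the text, the sketch below treats table look-ups, tape movement, and control as the residual:

```python
# Rough check of the quoted lunar-ephemeris timings. Operation counts
# and per-operation times come from the text; time spent on look-ups,
# tape, and control is inferred as the residual (no figure is quoted).

additions, multiplications, lookups = 11_000, 9_000, 2_000
t_add, t_mul = 285e-6, 20e-3     # seconds per operation (from the text)
total = 7 * 60                   # about seven minutes per Moon position

arithmetic = additions * t_add + multiplications * t_mul
print(f"additions:       {additions * t_add:6.1f} s")        # ~3.1 s
print(f"multiplications: {multiplications * t_mul:6.1f} s")  # ~180 s
print(f"residual (look-ups, tape, control): {total - arithmetic:6.1f} s")  # ~237 s
```

On these figures, multiplication accounts for roughly three of the seven minutes, with look-ups and tape handling plausibly filling most of the rest.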
Even by the end of 1948, the limited electronic memory of the SSEC was seen as a problem, and IBM soon licensed the Williams tube technology developed on the Manchester Baby at the Victoria University of Manchester.: 168 Subsequent computers would have electronic random access memory, and in fact the ability to execute instructions from processor registers was generally not adopted. The 77-bit wide programming word was also abandoned for fewer bits but much faster operation. By 1951 the Ferranti Mark I was marketed in the UK as a commercial computer using Williams tube technology, followed by the UNIVAC I using delay-line memory in the US. These memory technologies allowed stored-program features to be more practical. The stored-program concept had first been widely published in 1945 in the First Draft of a Report on the EDVAC and became known as the von Neumann architecture. The EDVAC (first working in 1949) was the ENIAC's successor, designed by the team who then marketed the UNIVAC. The SSEC ran until August 1952, when it was dismantled, having been made obsolete by fully electronic computers. An IBM 701 computer, known as the Defense Calculator, was installed in the same room for its April 7, 1953, public debut. In July 1953 the much less expensive (and even better selling) IBM 650 was announced, which had been developed by the same Endicott team who developed the SSEC.
========================================
[SOURCE: https://en.wikipedia.org/w/index.php?title=XAI_(company)&mobileaction=toggle_view_mobile] | [TOKENS: 1856]
xAI (company)

X.AI Corp., doing business as xAI, is an American artificial intelligence (AI), social media and technology company, and a wholly owned subsidiary of the American aerospace company SpaceX. Founded by Elon Musk in 2023, the company's flagship products are the generative AI chatbot Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025.

History

xAI was founded on March 9, 2023, by Musk. As chief engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment"; by May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024, Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired its sister company X Corp., the developer of the social media platform X (formerly known as Twitter), which Musk had previously acquired in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt; xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans; Morgan Stanley took no stake in it. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government", and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot. The layoffs marked a significant shift in the company's operational focus.
In June 2024, the Greater Memphis Chamber announced that xAI was planning to build Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer went fully operational in December 2024. Local government in Memphis has voiced concerns regarding the increased usage of electricity, 150 megawatts of power at peak. While the agreement with the city was being worked out, the company deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily enhance the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of polluting gases, and that xAI had been operating the turbines illegally without the necessary permits; the Southern Environmental Law Center has stated that the turbines produce about 2,000 tons of nitrogen oxide emissions annually. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. On November 26, 2025, Elon Musk announced plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, about 10% of the data center's estimated power use. xAI has continually expanded its infrastructure, purchasing a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power; the expansion is driven by xAI's commitment to compete with OpenAI's ChatGPT and Anthropic's Claude models. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that made xAI a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring reorganised xAI into four primary development teams, one for the Grok app and others for features such as Grok Imagine; Grokipedia, X and API features fall under smaller teams.

Products

According to Musk in July 2023, a politically correct AI would be "incredibly dangerous" and misleading, citing as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot integrated with X. xAI stated that when the bot was out of beta, it would only be available to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced. On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature; xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model called Grok Heavy was also unveiled, with access at the time costing $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026.
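As an illustration of the API mentioned above: xAI's API has commonly been described as compatible with the OpenAI client libraries, but the endpoint URL, model identifier, and that compatibility are all assumptions of this sketch rather than claims from the article; consult xAI's current documentation before relying on any of them.

```python
# Hypothetical sketch of calling the xAI API released in October 2024.
# ASSUMPTIONS (not from this article): OpenAI-compatible API, served at
# https://api.x.ai/v1, with "grok-3" as a valid model name.
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://api.x.ai/v1",  # assumed endpoint
    api_key="XAI_API_KEY",           # placeholder credential
)

response = client.chat.completions.create(
    model="grok-3",                  # assumed model identifier
    messages=[{"role": "user", "content": "Summarize the history of xAI in one sentence."}],
)
print(response.choices[0].message.content)
```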
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-cascading-125] | [TOKENS: 9291]
Internet

The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick-and-mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols that enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense, in collaboration with universities and researchers across the United States and in the United Kingdom and France. The Internet has no single centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF).

Terminology

The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common; this reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases.
The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources, linked by hyperlinks and URLs.

History

In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started with the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource-sharing network proposed by ARPA. ARPANET development began with two network nodes, interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675, and the Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s.
The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia, and the ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and CompuServe established connections to the Internet, delivering email and public-access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use, one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members, in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit similar characteristics to the scaling of MOS transistors, exemplified by Moore's law: doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser lightwave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber-optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services.
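The doubling rate quoted above compounds quickly; a few lines of arithmetic show what traffic doubling every 18 months implies over several years:

```python
# What "doubling every 18 months" compounds to, as attributed above to
# Moore's/Edholm's law-style growth in Internet traffic.
doubling_period_months = 18

def growth_factor(years: float) -> float:
    """Traffic multiplier after the given number of years."""
    return 2 ** (years * 12 / doubling_period_months)

for years in (3, 5, 10):
    print(f"{years:2d} years -> ~{growth_factor(years):,.0f}x traffic")
# 3 years -> ~4x, 5 years -> ~10x, 10 years -> ~102x
```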
During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as to the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of the world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication; by 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. Modern smartphones can access the Internet through cellular carrier networks, and Internet usage by mobile and tablet devices exceeded desktop usage worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connected to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost.

Social impact

The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers, with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion, or 44 percent of the world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, the China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, and the United States third with 275 million users. However, in terms of penetration, in 2022 China had a 70% penetration rate, compared to India's 60% and the United States's 90%.
In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population having access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters of the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. Several neologisms refer to Internet users: netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general, or surrounding political affairs and rights such as free speech; internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices; mobile phones, datacards, handheld game consoles and cellular routers allow users to connect wirelessly. Educational material at all levels, from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar), is available on websites. The Internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in the reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population.: 111 Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to the examination of pending patent applications. Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic.
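The mojibake mentioned above is easy to reproduce: bytes written in one character encoding are misread in another. A minimal demonstration (the sample string is arbitrary):

```python
# Reproducing mojibake: text written out as UTF-8 bytes but mistakenly
# decoded as Latin-1 displays garbled characters.
text = "café naïve"

garbled = text.encode("utf-8").decode("latin-1")
print(garbled)            # cafÃ© naÃ¯ve  <- mojibake

# Latin-1 maps every byte value to a character, so the mistake is
# reversible here: re-encoding recovers the original UTF-8 bytes.
restored = garbled.encode("latin-1").decode("utf-8")
print(restored == text)   # True
```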
The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPGs to first-person shooters, and from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video, with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions of videos daily and upload hundreds of thousands. Other video sharing websites include Vimeo, Instagram and TikTok. Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP, so that work may be performed from any location, such as the worker's home. The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites such as DonorsChoose and GlobalGiving allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes; Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
An example of collaborative work enabled by such software is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The Internet also allows for cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online, such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Out of naivety, children may also post personal information about themselves online, which could put them or their families at risk unless they are warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection, also exist. Internet usage has been correlated with users' loneliness: lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources, with employees spending significant amounts of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams by using the Internet to build and enhance relationships with clients and partners. According to the International Data Corporation, worldwide e-commerce, combining global business-to-business and business-to-consumer transactions, equated to $16 trillion in 2013. A report by Oxford Economics added those two figures together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet, such as maps and location-aware services, may serve to reinforce economic inequality and the digital divide.
Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people.

Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation.

The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet as a new method of organizing to carry out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize during the Arab Spring by enabling activists to organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies.

E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. E-government offers opportunities for more direct and convenient citizen access to government, and for government provision of services directly to citizens.

Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners who may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards.
In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.

Applications and services

The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services.

The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web; a minimal HTTP exchange is sketched at the end of this subsection. Web services also use HTTP for communication between software systems, for transferring information and for sharing and exchanging business data; it is one of many protocols that can be used for communication on the Internet. Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations.

Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP). VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone while offering substantial cost savings, especially over long distances.

File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet. The origin and authenticity of the file received may be checked by a digital signature.
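To make the HTTP request/response exchange described above concrete, here is a minimal sketch using Python's standard http.client module. The host example.org is a stand-in chosen for illustration, not one named in the text.

```python
# Minimal HTTP/1.1 GET over TLS using only the standard library.
# The host "example.org" is a placeholder chosen for illustration.
import http.client

conn = http.client.HTTPSConnection("example.org", timeout=10)
conn.request("GET", "/", headers={"User-Agent": "sketch/0.1"})
response = conn.getresponse()
print(response.status, response.reason)    # e.g. 200 OK
print(response.getheader("Content-Type"))  # media type of the returned resource
body = response.read()                     # the document itself, as bytes
conn.close()
```

The same client/server exchange underlies browsers and web services alike; only the body and headers differ.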
Governance

The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.

While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting working groups, open to any individual, on the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF website. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies.

To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016.

Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region: AFRINIC (Africa), APNIC (Asia-Pacific), ARIN (North America), LACNIC (Latin America and the Caribbean), and RIPE NCC (Europe, the Middle East, and parts of Central Asia).

The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.

Infrastructure

The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se.
Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.

Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy. An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.

Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology. Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide Internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, Internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber-optic submarine communication cables, connecting the Internet.

Internet Protocol Suite

The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123: the application layer, the transport layer, the internet layer, and the link layer. The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.

Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network. For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct Internet packets to their destinations.
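Since both address families coexist, one name can resolve to addresses in either or both. A small sketch, assuming Python's standard socket module (example.org is again a stand-in host), asks the system resolver for the IPv4 and IPv6 addresses of the same name:

```python
# Ask the system resolver for IPv4 (A) and IPv6 (AAAA) addresses of one host.
# Results depend on the local resolver and network; the host is a placeholder.
import socket

for family, label in ((socket.AF_INET, "IPv4"), (socket.AF_INET6, "IPv6")):
    try:
        infos = socket.getaddrinfo("example.org", 80, family, socket.SOCK_STREAM)
    except socket.gaierror:
        print(label, "lookup failed")  # e.g. no record of that type, or no route
        continue
    for *_, sockaddr in infos:
        print(label, "address:", sockaddr[0])  # textual form of the address
```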
IP addresses consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP), or are configured manually. The Domain Name System (DNS) converts user-entered domain names (e.g. "en.wikipedia.org") into IP addresses.

Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to approximately 4.3 billion (4.3×10^9) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted.

Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and continues to grow around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol. Network infrastructure, however, has been lagging in this development.

A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields: the network number or routing prefix, and the rest field or host identifier. The rest field is an identifier for a specific host or network interface.

The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2^96 addresses, having a 32-bit routing prefix.

For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.

Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols; a toy longest-prefix lookup is sketched below.
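The CIDR arithmetic above can be checked mechanically. A minimal sketch using Python's standard ipaddress module reproduces the 198.51.100.0/24 example from the text and then performs a toy longest-prefix-match lookup; the three-entry routing table is invented for illustration, not taken from the article.

```python
# CIDR prefixes, netmasks, and a toy longest-prefix-match lookup,
# reusing the 198.51.100.0/24 example from the text.
import ipaddress

net = ipaddress.ip_network("198.51.100.0/24")
print(net.netmask)                                  # 255.255.255.0
print(net.num_addresses)                            # 256: 2**8, incl. network and broadcast
print(ipaddress.ip_address("198.51.100.7") in net)  # True, falls inside the /24

# A routing table maps prefixes to next hops; the most specific
# (longest) matching prefix wins, and 0.0.0.0/0 is the default route.
table = {
    ipaddress.ip_network("198.51.100.0/24"): "eth0",
    ipaddress.ip_network("198.51.0.0/16"):   "eth1",
    ipaddress.ip_network("0.0.0.0/0"):       "default gateway",
}

def lookup(addr: str) -> str:
    dest = ipaddress.ip_address(addr)
    candidates = [n for n in table if dest in n]       # every prefix that covers dest
    best = max(candidates, key=lambda n: n.prefixlen)  # longest prefix wins
    return table[best]

print(lookup("198.51.100.42"))  # eth0  (matched by the /24)
print(lookup("198.51.7.9"))     # eth1  (matched by the /16)
print(lookup("203.0.113.5"))    # default gateway
```

Real routers implement the same longest-prefix rule in specialized data structures rather than a linear scan, but the selection logic is the one shown.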
End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol (BGP) to establish the most efficient routing across the complex connections of the global Internet. The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet.

Security

Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information. Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users. Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of cyber warfare being waged with similar methods on a large scale.

Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure, such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms.

The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance for Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by federal law enforcement agencies. Under the Act, all U.S. telecommunications providers are required to install packet sniffing technology to allow federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic. The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies such as the Information Awareness Office, NSA, GCHQ and the FBI spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by the Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by Germany's Siemens AG and Finland's Nokia.

Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters.
In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet, but do not mandate filter software. Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.

Performance

As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization.

[Figure: global Internet traffic volume in petabytes per month, 1990–2015.]

The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.

An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011, when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests.

Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade differing by a factor of roughly 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB (136 ÷ 0.0064 ≈ 21,000). The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis.

In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure.
The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
========================================
[SOURCE: https://en.wikipedia.org/wiki/P%C3%BAca] | [TOKENS: 2390]
Contents Púca

The púca (Irish for spirit/ghost; plural púcaí), puca (Old English for goblin), also pwca, pooka, pookah, phouka, and puck, is a creature of Celtic, English, and Channel Islands folklore. Considered to be bringers both of good and bad fortune, they could help or hinder rural and marine communities. Púcaí can have dark or white fur or hair. The creatures were said to be shape-changers that could take the appearance of horses, goats, cats, dogs, and hares. They may also take a human disguise, which includes various animal features, such as animal ears or a tail.

Etymology and analogues

The origin of the name is debated, with some theorising that it originated in the Irish language, but with a different spelling, as there was no p sound in Primitive Irish. It appears, from place name evidence, to have been in use as early as the 8th century. Since it is a 'cultural' rather than a practical word that might be used in trading, it is thought to reflect greater cultural contact between Celtic and Germanic cultures in the early medieval period than had been thought.

The púca has counterparts throughout the Celtic and Germanic cultures of Northwest Europe. For instance, in the Scandinavian languages we find, according to the OED, "Old Icelandic púki mischievous demon, the Devil, Faroese púki, Norwegian (originally and chiefly regional) puke devil, evil spirit, mischievous person, Old Swedish puke devil, evil spirit, Swedish (now chiefly regional) puke evil spirit, devil, goblin, Old Danish puge evil spirit". In Welsh mythology it is named the pwca and in Cornish the Bucca (thus being related in etymology and milieu to the bugaboo). In the Channel Islands, the pouque were said to be fairies who lived near ancient stones; in the Norman French of the Islands (e.g. Jèrriais), a cromlech, or prehistoric tomb, is referred to as a pouquelée or pouquelay(e); poulpiquet and polpegan are corresponding terms in Brittany.

Nature of the púca

The púca may be regarded as being either menacing or beneficial. Fairy mythologist Thomas Keightley said "notions respecting it are very vague", and in a brief description gives an account collected by Croker from a boy living near Killarney that "old people used to say that the Pookas were very numerous ... long ago ... , were wicked-minded, black-looking, bad things ... that would come in the form of wild colts, with chains hanging about them", and that did much to harm unwary travellers. Little boys and girls were also warned not to eat overripe blackberries, because this was a sign that the púca had entered them.

One theme of the púca's folklore is their proclivity for mischief. They are commonly said to entice humans to take a ride on their back, giving the rider a wild and terrifying journey before dropping the unlucky person back at the place they were taken from. This lore bears similarities to that of other Irish folk creatures, such as the daoine maithe (good people) or the slua sí (fairy host), said to target humans on the road or along their regular "passes". These human encounters with the púca tend to occur in rural, isolated places, far from settlements or homes.

While púca stories can be found across northern Europe, Irish tales specify a protective measure for encountering a púca. It is said that the rider may be able to take control of the púca by wearing sharp spurs, using those to prevent being taken or to steer the creature if already on its back.
A translation of an Irish púca story, "An Buachaill Bó agus an Púca", told by storyteller Seán Ó Cróinín, describes this method of control of the púca as used by a young boy who had been the creature's target once before:

... the farmer asked the lad what had kept him out so late. The lad told him. "I have spurs," said the farmer. "Put them on you tonight and if he brings you give him the spurs!" And this the lad did. The thing threw him from its back and the lad got back early enough. Within a week the (pooka) was before him again after housing the cows. "Come to me," said the lad, "so I can get up on your back." "Have you the sharp things on?" said the animal. "Certainly," said the lad. "Oh I won't go near you, then," he said.

The protective power of the "sharp things", as they are always referred to by the pooka in the tales, may stem from the Irish belief that "cold iron" has the ability to ward off the supernatural.

In contrast, the púca is represented as being helpful to farmers by Lady Wilde, who relates the following tale. A farmer's son named Padraig one day noticed the invisible presence of the púca brushing by, and called out to him, offering a coat. The púca appeared in the guise of a young bull, and told him to come to the old mill at night. From that time onward, the púca came secretly at night and performed all the work of milling the sacks of corn into flour. Padraig fell asleep the first time, but later concealed himself in a chest to catch sight of them, and later made the púca a present of a fine silk suit. This unexpectedly caused the púca to go off to "see a little of the world" and cease its work. But by then the farmer's wealth allowed him to retire and give his son an education. Later, at Padraig's wedding, the púca left a gift of a golden cup filled with drink that evidently ensured their happiness.

Another example of the púca as a benevolent or protective entity comes in tales where the creature intervenes before a terrible accident, or before the person is about to happen upon a malevolent fairy or spirit. In several of the regional variants of these stories, where the púca acts as a guardian, the púca identifies itself to the bewildered human. This is particularly noteworthy, as it is in contrast to the lore of many other folkloric beings, who guard their identities or names from humans.

Morphology and physiology

According to legend, the púca is a deft shapeshifter, capable of assuming a variety of terrifying or pleasing forms. It can take a human form, but will often have animal features, such as ears or a tail. As an animal, the púca will most commonly appear as a horse, cat, rabbit, raven, fox, wolf, goat, goblin, or dog. No matter what shape the púca takes, its fur is almost always dark. It most commonly takes the form of a sleek black horse with a flowing mane and luminescent golden eyes. The Manx glashtyn also takes on human form, but he usually betrays his horse's ears and is analogous to the each uisce.

If a human is enticed onto a púca's back, it has been known to give them a wild ride; however, unlike a kelpie, which will take its rider and dive into the nearest stream or lake to drown and devour them, the púca will do its rider no real harm. However, according to some folklorists, the only man ever to ride the púca was Brian Boru, High King of Ireland, who used a special bridle incorporating three hairs of the púca's tail. The púca has the power of human speech, and has been known to give advice and lead people away from harm.
Though the púca enjoys confusing and often terrifying humans, it is considered to be benevolent.

Agricultural traditions

Certain agricultural traditions surround the púca. It is a creature associated with Samhain, a Goidelic harvest festival, when the last of the crops are brought in. Anything remaining in the fields is considered "puka", or fairy-blasted, and hence inedible. In some locales, reapers leave a small share of the crop, the "púca's share", to placate the hungry creature. Nonetheless, 1 November is the púca's day, and the one day of the year when it can be expected to behave civilly. At the beginning of November, the púca was known—in some locales—to either defecate or spit on the wild fruits, rendering them inedible and unsafe thenceforth.

Regional variations

In some regions, the púca is spoken of with considerably more respect than fear; if treated with deference, it may actually be beneficial to those who encounter it. The púca is a creature of the mountains and hills, and in those regions there are stories of it appearing on November Day and providing prophecies and warnings to those who consult it. In some parts of County Down, the púca is manifested as a short, disfigured goblin who demands a share of the harvest; in County Laois, it appears as a monstrous bogeyman, while in Waterford and Wexford the púca appears as an eagle with a huge wingspan, and in Roscommon as a black goat.

Art and popular culture

William Shakespeare's 1595 play A Midsummer Night's Dream features the character "Robin Goodfellow", who is also called "sweet Puck", a version of the púca. The title character in the 1944 stage play Harvey – twice adapted into a 1950 film starring James Stewart and then as a 1996 film starring Harry Anderson and Leslie Nielsen – is an invisible six-foot, three-and-a-half-inch (1.92 m) tall anthropomorphic rabbit, who is referred to as a "pooka". Púcaí (or "pookas") appear as recurring atmospheric threats in the children's game show Knightmare (1987–1994); they appear as green, floating ghost-like creatures with characteristic prominent cheeks and plants for hair. In the tabletop role-playing game Changeling: The Dreaming (originally published by White Wolf Publishing in July 1995), a Pooka is one of the standard playable kiths (fairy species); they are described as lighthearted liars and tricksters who can shapeshift into a specific animal. In the book Knock Knock Open Wide by Neil Sharpson, Puckeen is a children's show and also the name of a mysterious, magical creature who lives in a black box and is never actually shown, only spoken of; in discussions of the book, Sharpson has said that the Puckeen was inspired by tales of the púca. The 2018 animated series Hilda and the graphic novel series it is based on feature the Pooka, which appears in the third season as a greedy character who often stops at the home of a relative of the main character to "borrow things". There is a statue of a púca in Ireland: the 2-meter-tall bronze sculpture was erected in the Burren at the Michael Cusack Centre in Carron, Co. Clare, in 2022.
========================================
[SOURCE: https://en.wikipedia.org/wiki/2008_US_presidential_election] | [TOKENS: 11081]
Contents 2008 United States presidential election

Presidential elections were held in the United States on November 4, 2008. The Democratic ticket of Barack Obama, the junior senator from Illinois, and Joe Biden, the senior senator from Delaware, defeated the Republican ticket of John McCain, the senior senator from Arizona, and Sarah Palin, the governor of Alaska. Obama became the first African American to be elected to the presidency. Incumbent Republican president George W. Bush was ineligible to pursue a third term due to the term limits established by the Twenty-second Amendment; this was the first election since 1952 in which neither the incumbent president nor vice president was on the ballot, and the first since 1928 in which neither ran for the nomination.

McCain secured the Republican nomination by March 2008, defeating his main challengers Mitt Romney and Mike Huckabee, and selected Palin as his running mate. The Democratic primaries were marked by a sharp contest between Obama and the initial front-runner, former first lady and Senator Hillary Clinton, as well as other challengers who dropped out before most of the primaries were held, including Senators John Edwards and Joe Biden. Clinton's victory in the New Hampshire primary made her the first woman to win a major party's presidential primary. After a long primary season, Obama narrowly secured the Democratic nomination in June 2008 and selected Biden as his running mate.

Bush's popularity had significantly declined during his second term, a decline attributed to growing opposition to the Iraq War, his response to Hurricane Katrina, the Abu Ghraib torture controversy, and the 2008 financial crisis. McCain opted to distance himself from Bush and did not campaign with him, nor did Bush appear in person at the 2008 Republican National Convention, although he did endorse McCain. Obama strongly opposed the Iraq War, as well as a troop surge that had begun in 2007, while McCain supported the war. Obama campaigned on the theme that "Washington must change", while McCain emphasized his experience. McCain's decision to suspend his campaign during the height of the financial crisis backfired, as voters viewed his response as erratic.

Obama won a decisive victory over McCain, winning the Electoral College and popular vote by sizable margins and flipping nine states that had voted Republican in 2004: Colorado, Florida, Indiana, Iowa, Nevada, New Mexico, North Carolina, Ohio, and Virginia, as well as Nebraska's 2nd congressional district. He won every state in the Great Lakes region. He also received the largest share of the popular vote won by a Democrat since Lyndon B. Johnson in 1964 and was the first Democrat to win an outright majority of the popular vote since Jimmy Carter in 1976. At the time, Obama also received the most popular votes in history, a record which would be broken in 2020. As of 2026, this is the most recent time a Democrat carried Indiana or North Carolina in a presidential election, and the most recent time a Democrat won the male vote in a presidential election.

Background

Article Two of the United States Constitution says that the president and vice president of the United States must be natural-born citizens of the United States, at least 35 years old, and residents of the United States for at least 14 years.
Candidates for president typically seek the nomination of a political party, in which case each party devises a method (such as a primary election) to choose a candidate for the position. Traditionally, the primaries are indirect elections where voters cast ballots for a slate of party delegates pledged to a particular candidate. The party's delegates then officially nominate a candidate to run on the party's behalf. The general election in November is also an indirect election, where voters cast ballots for a slate of members of the Electoral College; these electors in turn directly elect the president and vice president. President George W. Bush, a Republican and former governor of Texas, was ineligible to seek reelection to a third term due to the Twenty-second Amendment; in accordance with Section 1 of the Twentieth Amendment, his term expired at noon Eastern Standard Time on January 20, 2009.

Nominations

Media speculation had begun almost immediately after the results of the 2004 presidential election were released. In the 2006 midterm elections, the Democrats regained majorities in both houses of the U.S. Congress. Early polls taken before anyone had announced a candidacy had shown Senators Hillary Clinton and Barack Obama as the most popular potential Democratic candidates. Nevertheless, the media speculated on several other candidates, including Al Gore, the runner-up in the 2000 election; John Kerry, the runner-up in the 2004 election; John Edwards, Kerry's running mate in 2004; Delaware Senator Joe Biden; New Mexico Governor Bill Richardson; Iowa Governor Tom Vilsack; and Indiana Senator Evan Bayh.

Edwards was one of the first to formally announce his candidacy for the presidency, on December 28, 2006; it was his second attempt at the presidency. Clinton announced her intention to run in the Democratic primaries on January 20, 2007. Obama announced his candidacy on February 10 in his home state of Illinois. Early in the year, support for Barack Obama started to increase in the polls, and he passed Clinton for the top spot in Iowa; he ended up winning the caucus in that state, with Edwards coming in second and Clinton in third. Obama's win was fueled mostly by first-time caucus-goers and independents, and showed that voters viewed him as the "candidate of change". Iowa has since been viewed as the state that jump-started Obama's campaign and set him on track to win both the nomination and the presidency. After the Iowa caucus, Biden and Connecticut Senator Chris Dodd withdrew from the nomination contest.

Obama became the new front runner in New Hampshire when his poll numbers skyrocketed after his Iowa victory. The Clinton campaign was struggling after a huge loss in Iowa and had no strategy beyond the early primaries and caucuses. According to The Vancouver Sun, campaign strategists had "mapped a victory scenario that envisioned the former first lady wrapping up the Democratic presidential nomination by Super Tuesday on Feb. 5." In what is considered a turning point for her campaign, Clinton had a strong performance in the debate at Saint Anselm College sponsored by ABC and Facebook several days before the New Hampshire primary, as well as an emotional moment in an interview broadcast live on TV. Clinton won that primary by 2% of the vote, contrary to the predictions of pollsters, who had consistently shown her trailing Obama in the days leading up to the primary.
Clinton's win was the first time a woman had ever won a major American party's presidential primary for the purposes of delegate selection. On January 30, 2008, after placing third in the New Hampshire and South Carolina primaries, Edwards announced that he was suspending his campaign for the presidency, but he did not initially endorse any remaining candidates.

Super Tuesday was February 5, 2008, when the largest-ever number of simultaneous state primary elections was held. Super Tuesday ended up leaving the Democrats in a virtual tie, with Obama amassing 847 delegates to Clinton's 834 from the 23 states that held Democratic primaries. California was one of the Super Tuesday states that could provide a large number of delegates to the candidates. Obama trailed in the California polling by an average of 6.0% before the primary; he ended up losing that state by 8.3% of the vote. Some analysts cited a large Latino turnout for Clinton as the deciding factor.

The Louisiana, Nebraska, Hawaii, Wisconsin, U.S. Virgin Islands, District of Columbia, Maryland, and Virginia primaries and the Washington and Maine caucuses all took place after Super Tuesday in February. Obama won all of them, giving him 10 consecutive victories after Super Tuesday. On March 4, Hillary Clinton carried Ohio and Rhode Island in the Democratic primaries; some considered these wins, especially her roughly 10-point victory in Ohio, a "surprise upset", although she did lead the polling averages in both states. She also carried the primary in Texas, but Obama won the Texas caucuses held the same day and netted more delegates from the state than Clinton.

Only one state held a primary in April: Pennsylvania, on April 22. Although Obama made a strong effort to win Pennsylvania, Hillary Clinton won that primary by nearly 10%, with approximately 55% of the vote. Obama had outspent Clinton three to one in Pennsylvania, but his comment at a San Francisco fundraiser that small-town Americans "cling" to guns and religion drew sharp criticism from the Clinton campaign and may have hurt his chances in the Keystone State. In addition, Clinton had several advantages in Pennsylvania. Throughout the primary process, she relied on the support of older, white, working-class voters. Pennsylvania held a closed primary, which means that only registered Democrats could vote, and, according to Ron Elving of NPR, the established Democratic electorate "was older, whiter, more Catholic and more working-class than in most of the primaries to date."

After Pennsylvania, Obama had a higher number of delegates and popular votes than Clinton did and was still in a stronger position to win the nomination. Clinton, however, had received the endorsement of more superdelegates than Obama. On May 6, North Carolina and Indiana held their Democratic presidential primaries. Clinton and Obama campaigned aggressively there before the voting took place. Polling had shown Obama a few points ahead in North Carolina and Clinton similarly leading in Indiana. In the actual results, Obama outperformed the polls by several points in both states, winning by a significant margin in North Carolina and losing by only 1.1% in Indiana (50.56% to 49.44%). After these primaries, most pundits declared that it had become "increasingly improbable," if not impossible, for Clinton to win the nomination. The small win in Indiana barely kept her campaign alive for the next month.
Although she did manage to win the majority of the remaining primaries and delegates, it was not enough to overcome Obama's substantial delegate lead.

During late 2007, the two parties had adopted rules against states moving their primaries to an earlier date in the year. For the Republicans, the penalty for this violation was supposed to be the loss of half the state party's delegates to the convention. The Democratic penalty was the complete exclusion from the national convention of delegates from states that broke these rules. The Democratic Party allowed only four states to hold elections before February 5, 2008; Michigan and Florida nevertheless moved their primaries into January and were penalized. Clinton won a majority of delegates and popular votes in both states (though 40% voted uncommitted in Michigan) and subsequently led a fight to seat all the Florida and Michigan delegates. There was some speculation that the fight over the delegates could last until the convention in August. On May 31, 2008, the Rules and Bylaws Committee of the Democratic Party reached a compromise on the Florida and Michigan delegate situation: the committee decided to seat delegates from Michigan and Florida at the convention in August, but to award each only a half-vote.

The major-party nomination process technically continues through June of an election year. In previous cycles, the candidates had effectively been chosen by the end of the primaries held in March, but in this cycle Barack Obama did not win enough delegates to secure the nomination until June 3, after a 17-month campaign against Hillary Clinton. He had a wide lead in states won, while Clinton had won majorities in several of the larger states. Because Democratic state delegate contests were decided by a form of proportional representation based on the popular vote, the delegate totals remained close between Clinton and Obama. By May, Clinton claimed to hold a lead in the popular vote, but the Associated Press found that her numbers were "accurate only" in one close scenario.

In June, after the last of the primaries had taken place, Obama secured the Democratic nomination for president with the help of multiple superdelegate endorsements (most of the superdelegates had refused to declare their support for either candidate until the primaries were completed). He was the first African American to win the nomination of a major political party in the United States. For several days, Clinton refused to concede the race, although she signaled her presidential campaign was ending in a post-primary speech on June 3 in her home state of New York. She finally conceded the nomination to Obama on June 7, pledging her full support to the presumptive nominee and vowing to do everything she could to help him get elected.

Not only was the 2008 election the first time since 1952 that neither the incumbent president nor the incumbent vice president was a candidate in the general election, it was also the first time since the 1928 election that neither sought his party's nomination for president; as Bush was term-limited from seeking another nomination, the unique aspect was Vice President Cheney's decision not to seek the Republican nomination. The 2008 election was also the third presidential election since 1896 in which neither the incumbent president, the incumbent vice president, nor a current or former member of the incumbent president's Cabinet won the nomination of either major party, the others being 1920 and 1952.
With no members of the Bush administration emerging as major contenders for the Republican nomination, the Republican race was as open as the Democratic race. Immediately after the 2006 midterm elections, media pundits began speculating, as they did about the Democrats, about potential Republican candidates for president in 2008. In November 2006, former New York City Mayor Rudolph Giuliani led in the polls, followed closely by Arizona Senator John McCain. The media speculated that Giuliani's pro-choice stance on abortion and McCain's age and support of the unpopular Iraq War would be detriments to their candidacies. Giuliani remained the frontrunner in the polls throughout most of 2007, with McCain and former Tennessee Senator Fred Thompson fighting for second place.

Arkansas Governor Mike Huckabee, Giuliani, former Massachusetts governor Mitt Romney, and Texas Representative Ron Paul announced their candidacies on January 28, February 5, February 13, and March 12, respectively. McCain officially announced his candidacy on March 1, 2007, after several informal announcements. In the third quarter of 2007, the top four GOP (Republican) fundraisers were Romney, Giuliani, Thompson, and Ron Paul. MSNBC's Chuck Todd christened Giuliani and John McCain the front runners after the second Republican presidential debate in 2007.

Huckabee, the winner of Iowa, had little to no money and hoped for at least a third-place finish in New Hampshire. McCain eventually displaced Rudy Giuliani and Romney as the front runner in New Hampshire, staging a turnaround victory after having been written off by the pundits and polling in single digits less than a month before the race. With the Republicans stripping Michigan and Florida of half their delegates for moving their primaries into January 2008 against party rules, the early contest nevertheless ran through those states. McCain meanwhile managed a small victory over Huckabee in South Carolina, setting him up for a larger and more important victory over Romney in Florida, which held a closed primary on January 29. By this time, after several scandals, no success in the early primaries, and a third-place finish in Florida, Giuliani withdrew from the race and endorsed John McCain the next day. McCain was also endorsed in February by California Governor Arnold Schwarzenegger before the California primary took place on Super Tuesday. This gave him a significant boost in the polls for the state's primary, which awarded the greatest number of delegates of all the states.

On Super Tuesday, McCain won his home state of Arizona, taking all 53 delegates. He also won nearly all of California's 173 delegates, the largest of the Super Tuesday prizes. McCain also scored wins in seven other states, picking up 574 delegates. Huckabee was the "surprise performer", winning 5 states and 218 delegates. Romney won 7 states and 231 delegates. Two days later, Romney suspended his presidential campaign, saying that if he stayed in the race, he would "forestall the launch of a national campaign and be making it easier for Senator Clinton or Obama to win". His departure left Huckabee and Paul as McCain's only major challengers in the remaining primaries and caucuses. Romney endorsed McCain on February 14.

Louisiana, the District of Columbia, Kansas, Wisconsin, and Washington held contests in February after Super Tuesday. Despite McCain picking up big victories, Huckabee won Louisiana and Kansas.
McCain narrowly carried the Washington caucuses over Huckabee and Paul, who made a strong showing. The Virgin Islands and Puerto Rico closed out February for the Republicans. After Super Tuesday, John McCain had become the clear front runner, but by the end of February, he still had not acquired enough delegates to secure the nomination. In March, John McCain clinched the Republican nomination after sweeping all four primaries held that day (Texas, Ohio, Vermont, and Rhode Island), putting him over the threshold of 1,191 delegates required to win the GOP nomination. Mike Huckabee then conceded the race to McCain, leaving Ron Paul, who had just 16 delegates, as his only remaining opponent. Romney would become the Republican presidential nominee four years later, losing the 2012 election to Barack Obama.

Along with the Democratic and Republican parties, three other parties nominated candidates with ballot access in enough states to reach the minimum of 270 electoral votes needed to win the election: the Constitution Party, the Green Party, and the Libertarian Party. In addition, independent candidate Ralph Nader ran his own campaign. The Constitution Party nominated writer, pastor, and conservative talk show host Chuck Baldwin for president, and attorney Darrell Castle from Tennessee for vice president. While campaigning, Baldwin voiced his opposition to the Iraq War, the Sixteenth Amendment, Roe v. Wade, the IRS, and the Federal Reserve. The Green Party nominated former Democratic representative Cynthia McKinney from Georgia for president, and political activist Rosa Clemente from New York for vice president. McKinney campaigned on a platform that supported single-payer universal health care, the withdrawal of American troops from Iraq and Afghanistan, reparations for African Americans, and the creation of a Department of Peace. The Libertarian Party nominated former Republican representative Bob Barr from Georgia for president, and his former rival for the Libertarian nomination, Wayne Allyn Root from Nevada, for vice president. During the 2008 presidential campaign, Barr advocated a reworking or abolition of the income tax and opposed the war in Iraq and the Patriot Act.

General election campaign

Until the onset of the 2008 financial crisis, the unpopular Iraq War was a key issue during the campaign. John McCain supported the war, while Barack Obama opposed it (Obama's early and strong opposition to the war helped him stand out against the other Democratic candidates during the primaries, as well as stand out to a war-weary electorate during the general campaign). Though McCain meant it as a peacetime presence like the United States maintained in Germany and Japan after World War II, his statement that the United States could be in Iraq for as much as the next 50 to 100 years would prove costly; Obama used it against him as part of his strategy to tie him to the unpopular President Bush.

John McCain's support for the troop "surge" employed by General David Petraeus, which was one of several factors credited with improving the security situation in Iraq, may have boosted McCain's stance on the issue in voters' minds. McCain, who had supported the invasion, argued that his support for the successful surge showed his superior judgment. However, Obama was quick to remind voters that there would have been no need for a "surge" had there been no war at all, thus questioning McCain's judgment.
George W. Bush had become increasingly unpopular among Americans by late 2005, due in part to the growing unpopularity of the Iraq War domestically and internationally, his handling of Hurricane Katrina in 2005, and later his handling of the 2008 financial crisis. By the time Obama was elected as President of the United States on November 4, 2008, Bush's approval rating was in the low to mid 20s, and his disapproval rating stood in the high 60s, and even the low 70s in some polls. Polls consistently showed that his approval ratings among American voters had averaged around 30 percent. In March 2008, Bush endorsed McCain at the White House, but did not make a single appearance for McCain during the campaign. Bush appeared at the 2008 GOP convention only through a live video broadcast; he chose not to appear in person due to the disaster in the Gulf of Mexico in the aftermath of Hurricane Gustav. Although he supported the war in Iraq, McCain made an effort to show that he had disagreed with Bush on many other key issues, such as climate change. Throughout the general election campaign, Obama countered by pointing out in ads and at numerous campaign rallies that McCain had claimed in an interview that he voted with Bush 90% of the time, and congressional voting records supported this for the years Bush was in office.

As with Senator Bob Dole's 1996 presidential campaign, one of the more widely leveled charges against McCain was the issue of his age: he turned 72 in August, and there was widespread concern about the idea of electing a man who would be 80 years old if he completed two full terms in office (the oldest president, Ronald Reagan, had been a month shy of 78 when he left office in January 1989). In addition, McCain suffered from the ill effects of his captivity in North Vietnam and reportedly had difficulty lifting his arms above his head. His age in particular was considered a liability against the youthful Senator Obama, who was nearly twenty-five years his junior. McCain, by comparison, was born before World War II and belonged to the Silent Generation. Much like Bob Dole, McCain attempted to counter these charges by releasing all of his medical records, something Obama did not do. McCain's wife Cindy dismissed concerns about his health by arguing that "We went hiking the Grand Canyon last summer and [John] did great and had no trouble keeping up with us." McCain also appeared at several campaign stops with his still-active 95-year-old mother. In a speech on the House floor, Pennsylvania Congressman John Murtha criticized McCain's age by saying, "Seven presidents have come and gone since I've been in Congress, and I saw the toll the job took on each one of them." If elected, McCain would have been the first president born in the 1930s. McCain died in 2018, the year after the completion of Obama's second term.

Like the Clinton campaign in 1996, Obama avoided discussing McCain's age directly, instead preferring to simply call his ideas and message "old" and "old hat". He also made a strong appeal to youth voters and, back during his primary contest with Hillary Clinton, had stated, "When I watched the feud between the Clintons and [Newt Gingrich] unfold during the 1990s, I was reminded of old quarrels started on college campuses long ago. It's time for a new generation to take over." Obama's active use of a BlackBerry and other modern technology also stood in contrast to the Arizona senator's admission that he was an infrequent user of email and the internet.
McCain's service in Vietnam, while marketable to baby boomers, was referred to as "unimportant" to younger voters. Obama promised "universal health care, full employment, a green America, and an America respected instead of feared by its enemies". Polls regularly found the general electorate as a whole divided more evenly between 'change' and 'experience' as candidate qualities than the Democratic primary electorate, which had split in favor of 'change' by a nearly 2–1 margin. McCain's advantage on experience and Obama's on the ability to bring change remained steady through the November 4 election. However, final pre-election polling found that voters considered Obama's inexperience less of an impediment than McCain's association with sitting president George W. Bush, an association the Obama campaign rhetorically framed throughout the election season as "more of the same". McCain appeared to undercut his own line of attack on Obama's inexperience by picking first-term Alaska governor Sarah Palin as his running mate. Palin had been governor only since 2006, and before that had been a council member and mayor of Wasilla. The choice of Palin was controversial; however, it appeared to address two pressing concerns: McCain's age and health (a youthful vice president would succeed him in office if he died or became incapacitated) and his weakness with right-wing conservatives, a group that had been comparatively unmoved by McCain. Palin also came off as more down-to-earth and relatable to average Americans than McCain, who was widely criticized as a "Beltway insider". However, media interviews suggested that Palin lacked knowledge on certain key issues, and these cast doubt among many voters about her qualifications to be vice president or president. In this regard, her inexperience became a liability once McCain's age and health were factored in, since there was a higher-than-normal probability of Palin succeeding to the presidency. "One 72-year-old heartbeat away from the presidency" became a popular anti-GOP slogan. She also came under attack on everything from her 17-year-old daughter giving birth to a child out of wedlock to her active participation in hunting moose and other animals. Because of Palin's conservative views, there was also concern that she would alienate independents and moderates, two groups that pundits observed McCain would need to win the election. Polls taken in the last few months of the presidential campaign and exit polls conducted on Election Day showed the economy as the top concern for voters. In the fall of 2008, many news sources were reporting that the economy was suffering its most serious downturn since the Great Depression. During this period, John McCain's election prospects fell amid several politically costly comments about the economy. On August 20, McCain said in an interview with Politico that he was uncertain how many houses he and his wife, Cindy, owned; "I think—I'll have my staff get to you," he told the media outlet. Both on the stump and in Obama's political ad "Seven", the gaffe was used to portray McCain as somebody unable to relate to the concerns of ordinary Americans. This out-of-touch image was further cultivated when, on September 15, the day of the Lehman Brothers bankruptcy, at a morning rally in Jacksonville, Florida, McCain declared that "the fundamentals of our economy are strong," despite what he described as "tremendous turmoil in our financial markets and Wall Street."
Because voters' perceptions ran to the contrary, the comment appeared to cost McCain politically. On September 24, 2008, after the onset of the 2008 financial crisis, McCain announced that he was suspending his campaign to return to Washington so he could help craft a $700 billion bailout package for the troubled financial industry, and he stated that he would not debate Obama until Congress passed the bailout bill. Despite this decision, McCain was portrayed as playing no significant role in the negotiations over the first version of the bill, which fell short of passage in the House. He eventually decided to attend the first presidential debate on September 26, despite Congress's lack of immediate action on the bill. His ineffectiveness in the negotiations and his reversal of his decision to skip the debate were seized upon to portray McCain as erratic in his response to the economy. Days later, a second version of the original bailout bill was passed by both the House and Senate, with Obama, his vice presidential running mate Joe Biden, and McCain all voting for the measure (as did Hillary Clinton). These remarks and campaign missteps, all of which came after the onset of the 2008 financial crisis and after McCain's poll numbers had already started to fall, hurt his standing with voters. Although sound bites of all of these "missteps" were played repeatedly on national television, many pundits and analysts say that the actual financial crisis and economic conditions caused McCain's large drop in support in mid-September and severely damaged his campaign. On health care, John McCain's proposals focused on open-market competition rather than government funding or control. At the heart of his plan were tax credits – $2,500 for individuals and $5,000 for families who do not subscribe to or do not have access to health care through their employer. To help people who are denied coverage by insurance companies due to pre-existing conditions, McCain proposed working with states to create what he called a "Guaranteed Access Plan". Barack Obama called for universal health care. His health care plan proposed creating a National Health Insurance Exchange that would include both private insurance plans and a Medicare-like government-run option. Coverage would be guaranteed regardless of health status, and premiums would not vary based on health status either. The plan would have required parents to cover their children but would not have required adults to buy insurance. Critics of McCain's plan argued that it would not significantly reduce the number of uninsured Americans, would increase costs, reduce consumer protections, and lead to less generous benefit packages. Critics of Obama's plan argued that it would increase federal regulation of private health insurance without addressing the underlying incentives behind rising health care spending. Health economist Mark Pauly suggested that a combination of the two approaches would work better than either one alone. A poll released in early November 2008 found that voters supporting Obama listed health care as their second priority; voters supporting McCain listed it as fourth, tied with the war in Iraq. Affordability was the primary health care priority among both sets of voters. Obama voters were more likely than McCain voters to believe government can do much about health care costs.
The general election debates were sponsored by the Commission on Presidential Debates (CPD), a bipartisan organization, which held four debates at various locations around the United States in September and October 2008. Three of the debates involved the presidential nominees, and one involved the vice-presidential nominees. Another debate was sponsored by the Columbia University political union and took place there on October 19. All candidates who could theoretically win the 270 electoral votes needed to win the election were invited, and Ralph Nader, Cynthia McKinney, and Chuck Baldwin agreed to attend. Amy Goodman, principal host of Democracy Now!, moderated. It was broadcast on cable by C-SPAN and on the Internet by Break-the-Matrix. The reported cost of campaigning for president had increased significantly in the years leading up to 2008. One source reported that if the costs for both Democratic and Republican campaigns were added together (for the presidential primary election, general election, and the political conventions), the cost had more than doubled in only eight years ($448.9 million in 1996, $649.5 million in 2000, and $1.01 billion in 2004). In January 2007, Federal Election Commission Chairman Michael E. Toner estimated that the 2008 race would be a $1 billion election, and that to be taken seriously, a candidate would have needed to raise at least $100 million by the end of 2007. According to required campaign filings as reported by the Federal Election Commission (FEC), 148 candidates across all parties collectively raised $1,644,712,232 and spent $1,601,104,696 for the primary and general campaigns combined through November 24, 2008. Elections analysts and political pundits, including The Cook Political Report, RealClearPolitics, and FiveThirtyEight, issued probabilistic forecasts of the composition of the Electoral College, using a variety of factors to estimate the likelihood of each candidate winning each state's electors and rating states along a scale from "safe" or "solid" to competitive. Internet campaigns Howard Dean had collected large contributions through the Internet in his 2004 primary run. In 2008, candidates went even further to reach out to Internet users through their own sites and such sites as YouTube, MySpace, and Facebook. On December 16, 2007, Ron Paul collected $6 million, more money in a single day through Internet donations than any previous presidential candidate in US history. The 2008 election was the first in which a majority of the voting-age population preferred to get their news from the Internet. Not only did the Internet allow candidates to raise money, but it also gave them a tool to appeal to newer and younger demographics. Political pundits began evaluating candidates based on their social media following. Senator Barack Obama's victory has been credited in part to his competitive edge in social media and Internet following. Obama had over 2 million American supporters on Facebook and 100,000 followers on Twitter, while McCain attracted only 600,000 Facebook supporters (likes) and 4,600 followers on Twitter. Obama's YouTube channel held 115,000 subscribers and more than 97 million video views.
Obama had maintained a similar advantage over Senator Hillary Clinton in the Democratic primary. Obama's edge in social media was widely seen as crucial to the election outcome. According to a study by the Pew Internet and American Life Project, 35 percent of Americans relied on online video for election news, and ten percent used social networking sites to learn about the election. The 2008 election showed huge increases in Internet use. Another study, conducted after the election, shed light on young voters: thirty-seven percent of Americans ages 18–24 got election news from social networking sites, and almost a quarter of Americans saw something about the election in an online video. YouTube and other online video outlets allowed candidates to advertise in ways never before possible. The Republican Party in particular was criticized for not adequately using social media and other means to reach young voters. Anonymous and semi-anonymous smear campaigns, traditionally conducted with fliers and push calls, also spread to the Internet. Organizations specializing in the production and distribution of viral material, such as Brave New Films, emerged; such organizations were said to have a growing influence on American politics. Controversies Allegations of voter list purges using unlawful criteria caused controversy in at least six swing states: Colorado, Indiana, Ohio, Michigan, Nevada, and North Carolina. On October 5, 2008, the Republican lieutenant governor of Montana, John Bohlinger, accused the Montana Republican Party of vote caging to purge 6,000 voters from three counties that trend Democratic. Allegations arose in Michigan that the Republican Party planned to challenge the eligibility of voters based on lists of foreclosed homes. The campaign of Democratic presidential nominee Barack Obama filed a lawsuit challenging this. The House Judiciary Committee wrote to the Department of Justice requesting an investigation. Libertarian candidate Bob Barr filed a lawsuit in Texas to have Obama and McCain removed from the ballot in that state. His campaign alleged that both candidates had missed the August 26 deadline to file and had been included on the ballot in violation of Texas election law; neither Obama nor McCain had been confirmed as the candidate of his respective party at the time of the deadline. The Texas Supreme Court dismissed the lawsuit without explanation. In Ohio, identified by both parties as a key state, allegations surfaced from both Republicans and Democrats that individuals from out of state were moving there temporarily and attempting to vote despite not meeting the state's requirement of permanent residency for more than 29 days. The Franklin County Board of Elections referred 55 cases of possible voting irregularities to the local prosecutor. Three groups attracted particular notice: 'Vote from Home,' 'Vote Today Ohio,' and 'Drop Everything and Come to Ohio.' Vote from Home attracted the most attention when thirteen of the group's members moved to the same location in eastern Columbus. Members of the group organized by Marc Gustafson, including several Marshall and Rhodes scholars studying at Oxford University, settled with Franklin County Prosecutor Ron O'Brien to have their challenged ballots withdrawn. The Obama campaign and others alleged that members of the McCain campaign had also voted without properly establishing residency. Since 1953, only six people in Ohio have gone to prison for illegal voting.
Republicans and independents leveled significant criticism at media outlets' coverage of the presidential election season. An October 22, 2008 Pew Research Center poll estimated that 70% of registered voters believed journalists wanted Barack Obama to win the election, as opposed to 9% for John McCain. Another Pew survey, conducted after the election, found that 67% of voters thought the press had covered Obama fairly, versus 30% who viewed the coverage as unfair; regarding McCain, 53% of voters viewed his press coverage as fair versus 44% who characterized it as unfair. Among affiliated Democrats, 83% believed the press fairly covered Obama; just 22% of Republicans thought the press was fair to McCain. At the February Democratic debate, Tim Russert of NBC News was criticized for what some perceived as disproportionately tough questioning of Democratic presidential contender Hillary Clinton. Among the questions, Russert had asked Clinton, but not Obama, to provide the name of the new Russian president (Dmitry Medvedev). This was later parodied on Saturday Night Live. Earlier, in October 2007, liberal commentators had accused Russert of harassing Clinton over the issue of supporting drivers' licenses for illegal immigrants. On April 16, ABC News hosted a debate in Philadelphia, Pennsylvania. Moderators Charles Gibson and George Stephanopoulos were criticized by viewers, bloggers, and media critics for the poor quality of their questions. Many viewers said they considered some of the questions irrelevant when measured against the importance of the faltering economy or the Iraq War. Included in that category were continued questions about Obama's former pastor, Senator Hillary Clinton's assertion that she had to duck sniper fire in Bosnia more than a decade prior, and Senator Obama's not wearing an American flag pin. The moderators focused on campaign gaffes, and some believed they focused too much on Obama. Stephanopoulos defended their performance, saying "Senator Obama was the front-runner" and the questions were "not inappropriate or irrelevant at all." In an op-ed published on April 27, 2008, in The New York Times, Elizabeth Edwards wrote that the media covered much more of "the rancor of the campaign" and "amount of money spent" than "the candidates' priorities, policies and principles." Author Erica Jong commented that "our press has become a sea of triviality, meanness and irrelevant chatter." A Gallup poll released on May 29, 2008, also estimated that more Americans felt the media was being harder on Hillary Clinton than on Barack Obama. Time magazine columnist Mark Halperin stated that the media during the 2008 election had a "blind, almost slavish" worship of Obama. The Project for Excellence in Journalism and Harvard University's Joan Shorenstein Center on the Press, Politics and Public Policy conducted a study of 5,374 media narratives and assertions about the presidential candidates from January 1 through March 9, 2008. The study found that Obama received 69% favorable coverage and Clinton 67%, compared with only 43% favorable coverage of McCain. Another study, by the Center for Media and Public Affairs at George Mason University, found media coverage of Obama to be 72% negative from June 8 to July 21, compared to 57% negative for McCain. An October 29 study found 29% of stories about Obama to be negative, compared to 57% of stories about McCain. Conduct Election Day was on November 4, 2008.
The majority of states allowed early voting, and all states allowed some form of absentee voting. Voters cast votes for listed presidential candidates but were actually selecting representatives for their state's Electoral College slate. A McCain victory quickly became improbable as Obama amassed early wins in his home state of Illinois, the Northeast, and the critical battleground states of Ohio and Pennsylvania by 9:30 pm Eastern Standard Time. Obama won the entire Northeast by comfortable margins and the Great Lakes states of Michigan, Wisconsin, and Minnesota by double digits. McCain held on to traditionally Republican states like North Dakota, South Dakota, Nebraska (though notably, Obama did win an electoral vote from Nebraska's 2nd congressional district), Kansas, Oklahoma, Montana, Utah, Idaho, Wyoming, and his home state of Arizona. Of the southern states, Obama won Florida, North Carolina, Delaware, Maryland, and Virginia. Obama also won the hotly contested states of Iowa and New Mexico, which Al Gore had won in 2000 and George W. Bush in 2004. Also, for only the second time since 1936 (1964 being the other), Indiana went Democratic, giving Obama all eight Great Lakes states, the first time a presidential candidate had won all of them since Richard Nixon in 1972. CNN and Fox News called Virginia for Obama shortly before 11:00 pm, leaving him only 50 electoral votes shy of victory with six western states (California, Oregon, Washington, Idaho, Alaska, and Hawaii) still voting. All American networks called the election in favor of Obama at 11:00 pm as the polls closed on the West Coast. Obama was immediately declared the winner in California, Oregon, Washington, and Hawaii, McCain won Idaho, and the Electoral College totals were updated to 297 for Obama and 146 for McCain (270 are needed to win). McCain gave a concession speech half an hour later in his hometown of Phoenix, Arizona. Obama appeared just before midnight Eastern Time in Grant Park, Chicago, in front of a crowd of 250,000 people to deliver his victory speech. Following Obama's speech, spontaneous street parties broke out in cities across the United States, including Philadelphia, Houston, Las Vegas, Miami, Chicago, Columbus, Detroit, Boston, Los Angeles, Portland, Washington, D.C., San Francisco, Denver, Atlanta, Madison, and New York City, and around the world in London; Bonn; Berlin; Obama, Japan; Toronto; Rio de Janeiro; Sydney; and Nairobi. Later on election night, after Obama was named the winner, he picked up several more wins in swing states in which the polls had shown a close race, including Florida, Indiana, and the western states of Colorado and Nevada, all of which had been carried by Bush in 2004. North Carolina and the bellwether state of Missouri remained undecided for several days. Eventually Obama was declared the winner in North Carolina and McCain in Missouri, with Obama pulling out a rare win in Nebraska's 2nd congressional district. This put the projected electoral vote count at 365 for Obama and 173 for McCain. Obama's victories in the populous swing states of Florida, Ohio, Pennsylvania, North Carolina, and Virginia contributed to his decisive win. The presidential electors cast their ballots for president and vice president, and Congress tallied these votes on January 8, 2009. Voter turnout was broadly predicted to be high by American standards, and a record number of votes were cast.
The final tally of total votes counted was 131.3 million, compared to 122.3 million in 2004 (itself a record total at the time, and the highest turnout rate since 1968, the last presidential election before the voting age was lowered to 18). Expressed as a percentage of eligible voters, 131.3 million votes could reflect a turnout as high as 63.0%, which would be the highest since 1960; this figure is based on an estimated eligible voter population of 208,323,000. Another estimate puts the eligible voter population at 213,313,508, yielding a turnout rate of 61.6%, which would be the highest since 1968. Broken down by age group, voters under 35 voted for Obama by a large majority, while McCain was most popular among voters over 60; voters between 35 and 59 were split nearly 50/50 between the two candidates. American University's Center for the Study of the American Electorate released a report on November 6, 2008, two days after the election, which concluded that the anticipated increase in turnout had failed to materialize; that report was the basis for some news articles indicating that voter turnout fell short of expectations. When the remaining votes were counted after the release of the report, however, the total number of votes cast in the presidential election rose to 131.3 million, surpassing the report's preliminary estimate of 126.5 to 128.5 million voters by between 2% and 4%. The election saw increased participation from African Americans, who made up 13.0% of the electorate, versus 11.1% in 2004. According to exit polls, over 95% of African Americans voted for Obama. This played a critical role in Southern states such as North Carolina: 74% of North Carolina's registered African American voters turned out, as opposed to 69% of North Carolinians in general, with Obama carrying 100% (with rounding) of African-American women and African Americans age 18 to 29, according to exit polling. This was also the case in Virginia, where much higher turnout among African Americans propelled Obama to victory in the former Republican stronghold. Even in southern states Obama lost, such as Georgia and Mississippi, large African American turnout made him much more competitive than John Kerry had been in 2004. Besides the Democratic and Republican nominees, the candidates of the Constitution, Green, and Libertarian parties and independent Ralph Nader appeared on the ballot for a majority of the voters; no other candidate had ballot access in enough states to win 270 electoral votes, and the 17 other listed candidates were available to no more than 30% of the voters. In Nevada, 6,267 votes were cast for "None of These Candidates". In the three states that officially keep track of "blank" votes for president, 103,193 votes were recorded as blank. More than 100,000 write-in votes were cast and recorded for a scattering of other candidates, including 62 votes for "Santa Claus" (in ten states) and 11 votes for "Mickey Mouse" (in five states). According to the Federal Election Commission, an unusually high number of "miscellaneous" write-ins were cast for president in 2008, including 112,597 tallied in the 17 states that record votes for non-listed candidates. There were more presidential candidates on the ballot than at any other time in U.S. history except the 1992 election, which also had 23 candidates listed in at least one state.
Results Of the 3,154 counties/districts/independent cities making returns, McCain won the most popular votes in 2,270 (71.97%) while Obama carried 884 (28.03%). Popular vote totals, including state-by-state results, are from the official Federal Election Commission report; the margin in each state is Obama's margin of victory over McCain (negative for states and districts won by McCain). The results of the electoral vote were certified by Congress on January 8, 2009. Maine and Nebraska each allow their electoral votes to be split between candidates: in both states, two electoral votes are awarded to the winner of the statewide race and one electoral vote to the winner of each congressional district. In states where the margin of victory was under 1%, 26 electoral votes were at stake (15 won by Obama, 11 by McCain); where the margin was between 1% and 5%, 62 electoral votes (59 won by Obama, 3 by McCain); and where the margin was between 5% and 10%, 73 electoral votes (33 won by Obama, 40 by McCain). Voter demographics were measured by exit polls conducted by Edison Research of Somerville, New Jersey, for the National Election Pool, a consortium of ABC News, Associated Press, CBS News, CNN, Fox News, and NBC News. Analysis Obama, having a white mother and a Kenyan father of the Luo ethnic group, became the first African American as well as the first biracial president. Several black people had previously run for president, including Shirley Chisholm, Jesse Jackson, Lenora Fulani, Carol Moseley Braun, Alan Keyes, and Al Sharpton, but Obama was the first ever to win the nomination of a major party, let alone the general election. The Obama–Biden ticket was also the first winning ticket in American history in which neither candidate was a white Protestant: Biden is Roman Catholic and became the first Roman Catholic elected vice president; all previous tickets with Catholic vice presidential candidates had been defeated (1964, 1972, 1984). The Obama–Biden ticket was the first winning ticket consisting of two sitting senators since John F. Kennedy and Lyndon B. Johnson in 1960 (in the previous cycle, 2004, Democrats had also nominated two sitting senators, John Kerry of Massachusetts and John Edwards of North Carolina, but they lost to incumbents Bush and Cheney), and Obama became the first Northern Democratic president since Kennedy. Obama also became the first Democratic candidate to win a majority of the popular vote since Jimmy Carter in 1976, the first to win a majority of both votes and states since Lyndon Johnson in 1964, and the first Northern Democrat to win a majority of both votes and states since Franklin Roosevelt in 1944. Obama became the first Northern Democrat to win any state in the former Confederacy since Hubert Humphrey won Texas in 1968. This was also the first presidential election since 1952 in which neither major-party nominee was the incumbent president or vice president.
This is the only election in which both major-party nominees were sitting senators. Prior to the election, commentators discussed whether Obama would be able to redraw the electoral map by winning states that had been voting for Republican candidates in recent decades, and in many ways he was successful. He won every region of the country by double digits except the South, which John McCain won by nine percent, although Obama nonetheless carried Delaware, the District of Columbia, Maryland, North Carolina, Florida, and Virginia (all part of the South as defined by the US Census Bureau). McCain won every state in the Deep South, where white voters had generally supported Republican candidates by increasingly large margins over the previous few decades. Obama won all of the 2004 swing states (states that either Kerry or Bush had won by less than 5%) by a margin of 8.5 percent or more, except for Ohio, which he carried by 4.5 percent. Obama also defied political bellwethers, becoming the first person to win the presidency while losing Missouri since 1956 and while losing Kentucky and Tennessee since 1960. He was the first Democrat ever to win the presidency without carrying Missouri, the first to win without carrying Arkansas since that state joined the Union in 1836, and the first to win without West Virginia since 1916 (and, because one West Virginia elector had voted Democratic in 1916, Obama became the first Democrat to win the White House without any of the state's electors since its founding in 1863). Indiana and Virginia voted for the Democratic nominee for the first time since 1964, as did a solitary electoral vote from Nebraska's 2nd congressional district. Indiana would return to being a reliably red state in subsequent elections; Virginia, however, has been won by Democrats in every presidential election since and would grow increasingly Democratic at the state level. North Carolina, which Obama was the first Democrat to carry since 1976, would return to the Republican column in the following elections, though only by narrow margins each time. Obama was also relatively competitive in some traditionally Republican states he lost, notably Montana, which he lost by under 3%, and Georgia, which he lost by just 5%. He is also the only 21st-century Democrat to lose North Dakota and South Dakota by single digits. McCain remains the last presidential candidate to receive fewer than 200 electoral votes. This was the first presidential election in which Nebraska split its electoral votes between two candidates. Together with Maine, which would not split its votes until 2016, Nebraska is one of two states that allow a split in electoral votes without faithless electors: a candidate receives one electoral vote for each congressional district won (Nebraska has three, Maine two), while the statewide winner receives an additional two electoral votes. Obama won the electoral vote from Nebraska's 2nd congressional district, largely comprising the city of Omaha; Nebraska's other four electoral votes went to John McCain. This would not happen again until 2020. As of 2024, this election is the last time that Indiana or North Carolina voted Democratic, and it is also the most recent election in which one of the nominees has since died. Until 2024, this was also the most recent election in which any of the major presidential nominees had military experience. This election exhibited the continuation of some of the polarization trends evident in the 2000 and 2004 elections.
McCain won whites 55–43 percent, while Obama won blacks 95–4 percent, Hispanics 67–31 percent, and Asians 62–35 percent. Voters aged 18–29 voted for Obama by 66–32 percent, while elderly voters backed McCain 53–45 percent. The 25-year age gap between McCain and Obama was the widest in U.S. presidential election history among the top two candidates.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Howard_Dean] | [TOKENS: 7693]
Contents Howard Dean Howard Brush Dean III (born November 17, 1948) is an American physician, author, consultant, and retired politician who served as the 79th governor of Vermont from 1991 to 2003 and chair of the Democratic National Committee (DNC) from 2005 to 2009. Dean was an unsuccessful candidate for the Democratic nomination in the 2004 presidential election. His implementation of the fifty-state strategy as head of the DNC has been credited with the Democratic victories in the 2006 and 2008 elections. Afterward, he became a political commentator and consultant to McKenna Long & Aldridge, a law and lobbying firm. Before entering politics, Dean earned his medical degree from the Albert Einstein College of Medicine in 1978. Dean served as a member of the Vermont House of Representatives from 1983 to 1986 and as lieutenant governor of Vermont from 1987 to 1991; both were part-time positions that enabled him to continue practicing medicine. In 1991, Dean became governor of Vermont when Richard A. Snelling died in office. Dean was subsequently elected to five two-year terms, serving from 1991 to 2003 and making him the longest-serving governor in the history of the state of Vermont. He served as chairman of the National Governors Association from 1994 to 1995. During his governorship, Vermont paid off much of its public debt, balanced its budget 11 times, and lowered income taxes twice. Dean also oversaw the expansion of the "Dr. Dynasaur" program, which ensures universal health care for children and pregnant women in the state; he is a staunch supporter of universal health care. Dean denounced the 2003 invasion of Iraq and called on Democrats to oppose the Bush administration. Initially seen as a long-shot candidate in the 2004 election, Dean pioneered Internet-based fundraising and grassroots organizing centered on mass appeal to small donors, an approach that is more cost-efficient than courting fewer, larger donors and that promotes active, participatory democracy among the general public. As a result of this unconventional strategy, he became the top fundraiser and front-runner for the Democratic Party presidential nomination. Dean then had a disappointing third-place finish in the Iowa caucuses, and his campaign suffered after negative media reactions to a hoarse "Yeah" that he shouted after enumerating states he hoped to win; he ultimately lost the nomination to Senator John Kerry of Massachusetts. In 2004, Dean founded Democracy for America, a progressive political action committee. He was elected chairman of the Democratic National Committee in February 2005. As party chairman, Dean created and employed the fifty-state strategy, which attempted to make Democrats competitive in normally conservative states often dismissed in the past as "solid red". The success of the strategy became apparent during the 2006 midterm elections, in which Democrats took back control of the House and Senate and won Senate seats in normally Republican states such as Missouri and Montana. In the 2008 election, the Democrats increased their House and Senate majorities, and Barack Obama used the fifty-state strategy as the backbone of his successful presidential candidacy. Dean was named chairman emeritus of the DNC upon his retirement in January 2009.
Since retiring from the DNC chairmanship, Dean has held neither elected office nor an official position in the Democratic Party and, as of 2015, was working for the global law firm Dentons as part of the firm's public policy and regulation practice. In 2013, Dean expressed interest in running for the presidency in 2016, but he instead supported former Secretary of State Hillary Clinton's run for president. Early life and education Dean was born in East Hampton, New York, to Andrée Belden (née Maitland), an art appraiser, and Howard Brush Dean Jr., an executive in the financial industry. Dean is the eldest of four brothers; his brothers include Jim Dean, chair of Democracy for America, and Charles Dean, who was captured by the Pathet Lao and executed by the North Vietnamese while traveling through Southeast Asia in 1974. Howard's father worked at the stock brokerage firm of Dean Witter. The family was quite wealthy, Republican, and belonged to the exclusive Maidstone Golf Club in East Hampton. Dean spent much of his childhood in East Hampton; the family built a house on Hook Pond there in the mid-1950s. While in New York, the family had a three-bedroom apartment on the Upper East Side along Park Avenue. Howard attended the Browning School in Manhattan until he was 13, and then went to St. George's School, a preparatory school in Middletown, Rhode Island. Beginning in September 1966, he attended Felsted School in the UK for one school year after winning an English Speaking Union scholarship. UPI quoted one of Dean's friends from his youth as saying, "By Hamptons standards, the Deans were not rich. No safaris in Africa or chalets in Switzerland. Howard's father went to work every day. He didn't own a company, or have a father or grandfather who founded one, as mine did." Peggy Noonan wrote in the Wall Street Journal: he doesn't seem like a WASP. I know it's not nice to deal in stereotypes, but there seems very little Thurston Howell, III, or George Bush, the elder, for that matter, in Mr. Dean. ... He seems unpolished, doesn't hide his aggression, is proudly pugnacious. He doesn't look or act the part of the WASP ... It will be harder for Republicans to tag Mr. Dean as Son of the Maidstone Club than it was for Democrats to tag Bush One as Heir to Greenwich Country Day. He just doesn't act the part. Dean graduated from Yale University with a Bachelor of Arts in political science in 1971. As a freshman, he specifically requested to room with an African American. The university housing office complied, and Dean roomed with two Southern black students and one white student from Pennsylvania. One of Dean's roommates was Ralph Dawson, the son of a sheet metal worker in Charleston, South Carolina, and today a New York City labor lawyer. Dawson said of Dean: Unless you operated from a stereotypic understanding of the Yale white boy as rich, you wouldn't know that about Howard. ... When it came to race—and I don't know whether this was a function of intent or just came naturally—Howard was not patronizing in any way. He was willing to confront in discussion what a lot of white students weren't. He would hold his ground. He would respect that I knew forty-two million times more about being black than he did. But that didn't mean he couldn't hold a view on something relating to civil rights that would be as valid as mine.
There were lots of well-meaning people at Yale who wanted you to understand that they understood your plight; you'd get into a conversation and they would yield too soon, so we didn't get the full benefit of the exchange. Howard very much thought he was capable of working an issue through. He was inquisitive. And when he came to a conclusion he would be as strong as anybody else. I don't think he's stubborn. He's a guy who's always been comfortable in his own skin. That's something you still see in him today, and it gets him into some degree of controversy. Though eventually eligible to be drafted into the military, Dean received a medical deferment because of an unfused vertebra. In response to Tim Russert asking on Meet the Press whether Dean could have served in the military had he not mentioned his back condition during his draft physical, Dean replied, "I guess that's probably true. I mean, I was in no hurry to get into the military." He briefly tried a career as a stockbroker before deciding on a career in medicine, completing pre-medicine classes at Columbia University. In 1974, Dean's younger brother Charlie, who had been traveling through Southeast Asia at the time, was captured and killed by Laotian guerrillas, a tragedy widely reported to have had an enormous influence on Dean's life; for many years he has worn his brother's belt nearly every day as a memento. Dean received his Doctor of Medicine degree from the Yeshiva University Albert Einstein College of Medicine in 1978 and began a medical residency at the University of Vermont. Vermont political career In 1980, Dean spearheaded a grassroots campaign to stop a condominium development on Lake Champlain, instead favoring the construction of a bicycle trail. The effort succeeded and helped launch his political career. That same year, he volunteered for Jimmy Carter's re-election campaign and served as a Carter delegate at the Democratic National Convention. In 1981, he was elected chairman of the Chittenden County Democratic Committee, a position he held until resigning in May 1984. In 1982, he was elected to the Vermont House of Representatives; he was reelected in 1984 and became assistant minority leader. He was elected lieutenant governor in 1986 and reelected in 1988 and 1990. All were part-time positions, and Dean continued to practice medicine alongside his wife until he became governor. On August 13, 1991, Dean was examining a patient when he received word that Governor Richard A. Snelling had died of sudden cardiac arrest. Dean assumed the office, which he called the "greatest job in Vermont", and was subsequently elected to five two-year terms in his own right, making him the longest-serving governor in the state's history. From 1994 to 1995, Dean was the chairman of the National Governors Association. Upon taking office, Dean faced an economic recession and a $60 million budget deficit. He bucked many in his own party to push immediately for a balanced budget, an act which marked the beginning of a record of fiscal restraint. During his tenure as governor, the state paid off much of its debt, balanced its budget eleven times, raised its bond rating, and lowered income taxes twice. Robert Dreyfuss wrote that, as a fiscal conservative, Dean navigated a triangular course between the two parties, clashing often with the Democrats over taxes and spending, and helping to drive many liberal-left Democrats into the arms of the Progressive Party and of Representative Bernie Sanders, Congress's lone socialist.
Inheriting a fiscal crisis from Snelling, Dean slashed the budget and dramatically reduced taxes. During the 1990s, Dean repeatedly unsheathed his veto pen, and he often allied with a growing contingent of conservative Blue Dog Democrats and Republicans to outmaneuver the Democratic leadership on issues such as taxes. Dean also focused on health care issues, most notably through the "Dr. Dynasaur" program, which ensures near-universal health coverage for children and pregnant women in the state; the uninsured rate in Vermont fell from 10.8 percent in 1993 to 8.4 percent in 2000 on his watch. Child abuse and teen pregnancy rates were cut roughly in half. The first decision of his career to draw significant national attention came in 2000, after the Vermont Supreme Court, in Baker v. State, ruled that the state's marriage laws unconstitutionally excluded same-sex couples and ordered the state legislature either to allow gays and lesbians to marry or to create a parallel status. Facing calls to amend the state constitution to prohibit either option, Dean chose to support the latter option and signed the nation's first civil unions legislation into law, spurring a short-lived "Take Back Vermont" movement that helped Republicans gain control of the State House. Dean was criticized during his 2004 presidential campaign for another decision related to civil unions: shortly before leaving office, he had some of his Vermont papers sealed for at least the next decade, a time frame longer than most outgoing governors use, stating that he was protecting the privacy of many gay supporters who had sent him personal letters about the issue. On the campaign trail, he demanded that Vice President Dick Cheney release his energy committee papers, and many people accused Dean of hypocrisy, among them Democratic Senator Joe Lieberman of Connecticut, a rival for the 2004 nomination who would later leave the party after losing his 2006 primary for re-election. Judicial Watch filed a lawsuit to force the papers to be opened before the seal expired, but lost. 2004 presidential candidacy Dean began his bid for president as a "long shot" candidate; ABC News ranked him eighth out of 12 in a list of potential presidential contenders in May 2002. In March 2003, he gave a speech strongly critical of the Democratic leadership at the California State Democratic Convention that attracted the attention of grassroots party activists and set the tone and the agenda of his candidacy. It began with the line: "What I want to know is what in the world so many Democrats are doing supporting the President's unilateral intervention in Iraq?" That summer, his campaign was featured as the cover article in The New Republic, and in the following months he received expanded media attention. His campaign slowly gained steam, and by the autumn of 2003, Dean had become the apparent frontrunner for the Democratic nomination, performing strongly in most polls and outpacing his rivals in fundraising. This latter feat was attributed mainly to his innovative embrace of the Internet for campaigning, using Meetup.com to track supporters and encourage grassroots participation in the campaign. The majority of his donations came from individual supporters, who came to be known as Deanites or, more commonly, Deaniacs, a term coined to describe meetup participants who passed out campaign materials supporting Dean and the broader movement.
(Critics often labeled them "Deany Boppers" or "Deanie Babies", a reference to his support from young activists.) Following Dean's presidential campaign, some Deaniacs remained engaged in the political process through Democracy for America and similar locally oriented organizations. Dean began his campaign by emphasizing health care policy and "fiscal responsibility", and by championing grassroots fundraising as a way to fight lobby groups. However, his opposition to the U.S. plan to invade Iraq (and his forceful criticism of Democrats in Congress who voted to authorize the use of force) quickly eclipsed other issues. By challenging the war in Iraq at a time when mainstream Democratic leaders were either neutral or cautiously supportive, Dean positioned himself to appeal to his party's activist base. Dean often quoted the late Minnesota Senator Paul Wellstone (who had recently died in a plane crash) as saying that he represented "the Democratic wing of the Democratic Party." His message resonated among frustrated Democratic primary voters who felt that their party had not done enough to oppose the policies of the Republicans, and it helped differentiate him from his primary opponents. Dean's organizational approach was also novel. His campaign made extensive use of the Internet, pioneering techniques that were subsequently adopted by politicians of all political persuasions. His supporters organized real-world meetings, many of them arranged through Meetup.com, participated in online forums, donated money online, canvassed for advertising ideas, and distributed political talking points. In terms of money, publicity, and activism, Dean quickly staked out a leadership position in the field of candidates. In this way, he was able to bypass existing party and activist infrastructure and build his own online network of supporters. In terms of traditional "ground troops", however, Dean remained at a disadvantage. He adopted a coffee-shop strategy of visiting grassroots activists in all 99 Iowa counties, but he lacked the campaign infrastructure to get voters to the polls that his opponents had. In the "invisible primary" of raising campaign funds, Howard Dean led the Democratic pack in the early stages of the 2004 campaign. Among the candidates, he ranked first in total raised ($25.4 million as of September 30, 2003) and first in cash on hand ($12.4 million). However, even this performance paled next to that of George W. Bush, who by that date had raised $84.6 million for the Republican primary campaign, in which he had no strong challenger. Prior to the 2004 primary season, the Democratic record for most money raised in one quarter by a primary candidate was held by Bill Clinton, who raised $10.3 million in a quarter in 1995 during a campaign in which he had no primary opponent. In the third quarter of 2003, the Dean campaign raised $14.8 million, shattering Clinton's record. All told, Dean's campaign raised around $50 million. While presidential campaigns had traditionally been financed by tapping wealthy, established political donors, Dean's funds came largely in small donations over the Internet; the average overall donation was just under $80. This method of fundraising offered several important advantages over traditional fundraising, in addition to the inherent media interest in what was then a novelty.
First, raising money on the Internet was relatively inexpensive compared to conventional methods such as events, telemarketing, and direct mail campaigns. Second, because donors on average contributed far less than the legal limit ($2,000 per person), the campaign could continue to resolicit them throughout the election season. Dean's director of grassroots fundraising, Larry Biddle, came up with the idea of the popular fundraising "bat", an image of a cartoon baseball player and bat that appeared on the site every time the campaign launched a fundraising challenge. The bat encouraged website visitors to contribute money immediately through their credit cards, filling up like a thermometer as donations came in, with the red color indicating the running total. The site often took suggestions from the netroots on its blog. One of these suggestions led to one of the campaign's biggest accomplishments: an image of Dean eating a turkey sandwich encouraged supporters to donate $250,000 in three days, matching the proceeds of a big-donor dinner held by Vice President Dick Cheney. In November 2003, after a much-publicized online vote among his followers, Dean became the first Democrat to forgo federal matching funds (and the spending limits that go with them) since the system was established in 1974. (John Kerry later followed his lead.) In addition to state-by-state spending limits for the primaries, the system limited a candidate to spending only $44.6 million until the Democratic National Convention in July, a sum that would almost certainly run out soon after the early primary season. (George W. Bush had declined federal matching funds in 2000 and did so again for the 2004 campaign.) In a sign that the Dean campaign was starting to think beyond the primaries, it began in late 2003 to speak of a "$100 revolution" in which two million Americans would give $100 in order to compete with Bush. Political commentators have stated that the fundraising of Barack Obama, with its emphasis on small donors and the Internet, refined and built upon the model that Dean's campaign pioneered. Though Dean lagged in early endorsements, he acquired many critical ones as his campaign snowballed. By the time of the Iowa caucuses, he led among commitments from superdelegates – elected officials and party officers entitled to convention votes by virtue of their positions. On November 12, 2003, he received the endorsements of the Service Employees International Union and the American Federation of State, County and Municipal Employees. Dean received the endorsement of former vice president and 2000 presidential candidate Al Gore on December 9, 2003. In the following weeks Dean was endorsed by former U.S. senators Bill Bradley and Carol Moseley Braun, unsuccessful Democratic presidential candidates from the 2000 and 2004 primaries, respectively. Several celebrities from the entertainment industry also endorsed him, including Joan Jett, Martin Sheen, Rob Reiner, Susan Sarandon, Paul Newman, Robin Williams, and Joseph Gordon-Levitt. Many pundits blamed such endorsements for the campaign's eventual collapse. Al Gore's early endorsement of Dean, made weeks before the first primary of the election cycle, was severely criticized by eight Democratic contenders, particularly since Gore did not endorse his former running mate, Joe Lieberman.
Gore supported Dean over Lieberman because of their differing opinions on Iraq, which had begun to diverge around 2002 (Lieberman supported the war and Gore did not). When Dean's campaign failed, some blamed Gore's early endorsement. On January 19, 2004, Dean's rivals John Kerry and John Edwards pushed him into a third-place finish in the 2004 Iowa Democratic caucuses, the first votes cast of the primary season. Dean's loud outburst in his public address that night was widely rebroadcast and portrayed as a media gaffe that ended his campaign. According to a Newsday editorial written by Verne Gay, some members of the television audience criticized the speech as loud, peculiar, and unpresidential. In particular, this quote from the speech was aired repeatedly in the days following the caucus: Not only are we going to New Hampshire, Tom Harkin, we're going to South Carolina and Oklahoma and Arizona and North Dakota and New Mexico, and we're going to California and Texas and New York. ... And we're going to South Dakota and Oregon and Washington and Michigan, and then we're going to Washington, D.C., to take back the White House! Yeah! Senator Harkin was on stage with Dean, holding his suit jacket. This final "Yeah!", delivered in an unusual tone that Dean later attributed to the cracking of his hoarse voice, has become known in American political jargon as the "Dean Scream" or the "I Have a Scream" speech. Comedians and late-night comedy show hosts such as Dave Chappelle and Conan O'Brien satirized, mocked, and popularized the sound bite, beginning a media onslaught that many believe contributed immensely to his poor showing in the subsequent races. Dean conceded that the speech did not project the best image, jokingly referring to it as a "crazy, red-faced rant" on the Late Show with David Letterman. In an interview later that week with Diane Sawyer, he said he was "a little sheepish ... but I'm not apologetic." Sawyer and many others in the national broadcast news media later expressed some regret about overplaying the story, and CNN issued a public apology, admitting in a statement that it might have "overplayed" the incident. The incessant replaying of the "Dean Scream" by the press sparked a debate over whether Dean was the victim of media bias. The scream scene was shown an estimated 633 times by cable and broadcast news networks in just four days following the incident, a number that does not include talk shows and local news broadcasts. Some in the audience that day reported that they were unaware of the "scream" until they saw it on TV. Dean said after the 2004 general election that his microphone had picked up only his voice and had not captured the loud cheering from the audience during the speech. On January 27, Dean finished second to Kerry in the New Hampshire primary. As late as one week before the first votes were cast in Iowa's caucuses, Dean had enjoyed a 30% lead in New Hampshire opinion polls; accordingly, this loss represented another major setback to his campaign. Iowa and New Hampshire were the first in a string of losses for the Dean campaign, culminating in a third-place showing in the Wisconsin primary on February 17. Two days before the Wisconsin primary, campaign advisor Steve Grossman announced, in an article by Jodi Wilgoren, The New York Times correspondent covering the Dean campaign, that he would offer his services to any of the other major candidates "should Dean not win in Wisconsin."
This scoop further undermined Dean's campaign; Grossman later issued a public apology. The next day, Dean announced that his candidacy had "come to an end", though he continued to urge people to vote for him so that Dean delegates would be selected for the convention and could influence the party platform. He later won the Vermont primary on Super Tuesday, March 2. This victory, a surprise even to Dean, was due in part to the lack of a serious anti-Kerry candidate in Vermont (John Edwards had declined to put his name on the state's ballot, expecting Dean to win in a landslide), and in part to a television ad produced, funded, and aired in Vermont by grassroots Dean supporters. The New York Observer attributed Barack Obama's success in the 2008 presidential election to his perfection of the Internet organizing model that Dean pioneered. On October 11, 2007, it was reported that Leonardo DiCaprio and George Clooney were in early talks about making a "political thriller" based on Howard Dean's 2004 campaign, tentatively titled Farragut North. The movie, finally titled The Ides of March, was released on October 7, 2011. It is based on the play Farragut North, which was named after the Washington Metro station located in the center of the lobbyist district. The play was written by Beau Willimon, a staffer on the Dean campaign, and its main character is based on a former press secretary for the campaign. In November 2008, a documentary film about Dean and his campaign, Dean and Me, was released and shown at several film festivals around the country. After withdrawing following the Wisconsin primary, Dean pledged to support the eventual Democratic nominee. He remained neutral until John Kerry became the presumptive nominee, endorsing Kerry on March 25, 2004, in a speech at The George Washington University in Washington, D.C. On March 18, 2004, Dean founded the group Democracy for America to house the large, Internet-based organization he had created for his presidential campaign. Its goal is to help like-minded candidates get elected to local, state, and federal offices, and it has endorsed several sets of twelve candidates known as the Dean Dozen. Dean turned over control of the organization to his brother, Jim Dean, when he became chairman of the Democratic National Committee. Dean strongly urged his supporters to back Kerry rather than Ralph Nader, arguing that a vote for Nader would only help re-elect President George W. Bush, since he believed that most Nader voters would otherwise have voted for Kerry. Dean argued that Nader would be more effective if he lobbied on election law reform issues during his campaign; Dean himself supported several such reforms, including campaign finance reform and instant-runoff voting. DNC Chairmanship Dean was elected chairman of the Democratic National Committee (DNC) on February 12, 2005, after all his opponents dropped out of the race when it became apparent that Dean had the votes to become chair. Those opponents included former Congressman Martin Frost, former Denver Mayor Wellington Webb, former Congressman and 9/11 Commissioner Tim Roemer, and strategists Donnie Fowler, David Leland, and Simon Rosenberg. Many prominent Democrats opposed Dean's campaign; House Minority Leader Nancy Pelosi and Senate Minority Leader Harry Reid were rumored to be among them. Dean satisfied his critics by promising to focus on fundraising and campaigning as DNC chair and to avoid policy statements.
In 2009 he was succeeded by Tim Kaine, then the governor of Virginia. Dean ran for the position a second time in 2016: two days after Hillary Clinton's defeat in the 2016 presidential election, he announced that he would again seek the chairmanship. Other contenders at the time had been endorsed by Senator Bernie Sanders of Vermont and Senate Minority Leader-elect Chuck Schumer of New York. On December 2, 2016, Dean withdrew his candidacy. During his 2005–2009 tenure, he promoted a "fifty-state strategy" and developed innovative fund-raising techniques. After Dean became Chairman of the DNC, he pledged to bring reform to the party. Rather than focusing just on swing states, Dean proposed what has come to be known as the fifty-state strategy, the goal of which was for the Democratic Party to be committed to winning elections at every level in every region of the country, with Democrats organized in every single voting precinct. State party chairs lauded Dean for raising money directly for the individual state parties. Dean's strategy used a post-Watergate model taken from the Republicans of the mid-seventies: working at the local, state, and national levels, the GOP had built its party from the ground up. Dean's plan was to seed the local level with young and committed candidates, building them into state candidates in future races. Dean traveled extensively throughout the country promoting the plan, including to places like Utah, Mississippi, and Texas, states in which Republicans had dominated the political landscape. Many establishment Democrats were at least initially dubious about the strategy's worth; political consultant and former Bill Clinton advisor Paul Begala suggested that Dean's plan was "just hiring a bunch of staff people to wander around Utah and Mississippi and pick their nose." Further changes attempted to make the Democratic Party's stated platform more coherent and compact. The party's website was overhauled, and the official platform of the 2004 campaign, widely criticized as avoiding key issues and as the product of party insiders, was replaced with a simplified though comprehensive categorization of positions on a wide range of issues. Dean's strategy arguably paid off in a historic victory as the Democrats took over control of the House of Representatives and the Senate in the 2006 mid-term elections. While the victory was likely also attributable to the Republican Party's shortcomings in its handling of the Iraq War and to the scandals that occurred shortly before the election, Dean's emphasis on connecting with socially conservative economic moderates in Republican-dominated states appears to have made some impact. Indeed, Democratic candidates won elections in such red states as Kansas, Indiana, and Montana. And while former Clinton strategist James Carville criticized Dean's efforts, saying more seats could have been won with the traditional plan of piling money solely into close races, the results and the strategy were met with tremendous approval by the party's executive committee in its December 2006 meeting. While he was chairman of the DCCC, Rahm Emanuel was known to have had disagreements over election strategy with Dean; Emanuel believed a more tactical approach, focusing attention on key districts, was necessary to ensure victory. Emanuel himself was criticized for failing to support some of the progressive candidates Dean advocated for.
The 50-state strategy relied on the idea that building the Democratic Party is at once an incremental, election-by-election process and a long-term project of party building. Democrats cannot compete in counties in which they do not field candidates; candidate recruitment therefore emerged as a core component of the 50-state strategy. To build the party, the DNC under Dean worked in partnership with state Democratic parties to bring the resources of the DNC to bear on electoral efforts, voter registration, candidate recruitment, and other interlocking elements of party building. Decentralization was also a core component of the party's approach. The idea was that each state party had unique needs but could improve upon its efforts through the distribution of resources from the national party. The 50-state strategy was acknowledged by political commentators as an important factor in allowing Barack Obama to compete against John McCain in many states that were previously considered solidly Republican during the 2008 presidential election, most notably in Obama's victories in Indiana, North Carolina, and Virginia. Through grassroots fundraising, Dean raised millions more than the previous DNC chairman had at the same point after the 2000 election; in the year after his election, he raised more money than any previous DNC chairman over a comparable post-election period. This was especially apparent when the Federal Election Commission reported that the DNC had raised roughly $86.3 million in the first six months of 2005, an increase of over 50% over the amount raised during the same period of 2003. In comparison, the RNC's fundraising grew by only 2%. A further attempt to capitalize on this trend was the introduction of "Democracy Bonds," a program under which small donors would give a set amount every month. Although it had reached only about 31,000 donors by May 2006, far off the pace required for the stated goal of 1 million by 2008, it nonetheless contributed to a new small-donor funding philosophy at the DNC. Dean continued to develop online fundraising at the DNC. Just one month before Election Day 2006, he became the first to introduce the concept of a "grassroots match," in which donors to the DNC pledged to match the first donation made by a new contributor. The DNC stated that the resulting flood of contributions led to 10,000 first-time donors in just a few days. Post-DNC career Supporters of Dean were angry that he was not given a position in the new Obama administration and was not invited to the press conference at which Tim Kaine was introduced as his successor as Democratic National Committee chairman. Joe Trippi, who was Dean's presidential campaign manager in 2004, told Politico, "[Dean] was never afraid to challenge the way party establishment in Washington did business, and that doesn't win you friends in either party." Trippi further explained the apparent snub of Dean by stating, "You don't have to look any further than Rahm Emanuel." Trippi was referring to the tension between Emanuel and Dean over Dean's 50-state strategy. Sources close to Emanuel dismissed these charges. Dean said: "I didn't do this for the spoils. I did this for the country. I'm very happy that Barack Obama is president, and I think he's picked a great Cabinet. And I'm pretty happy. I wouldn't trade my position for any other position right now.
I'm going to go into the private sector, make a living making speeches, and do a lot of stuff on health care policy." When asked about not being selected for a position in the Obama administration, Dean responded, "Obviously, it would have been great, but it's not happening and the president has the right to name his own Cabinet, so I'm not going to work in the government it looks like." When asked how he felt about not being selected, Dean replied he would "punt on that one." After the withdrawal of Tom Daschle's nomination, Dean was touted by many for the post of Secretary of Health and Human Services. After being passed over for the post once again, Dean commented: "I was pretty clear that I would have liked to have been Secretary of HHS but it is the president's choice and he decided to go in a different direction." Outside the US Dean is a supporter of the Liberal Democrats, a political party in the United Kingdom. He has close links with the party and has spoken at its party conference in the past. Since the UK began the Brexit process, he has continued to tweet his support for the party. After leaving office, Dean emerged as a major supporter of the People's Mujahedin of Iran (Mujahedeen-e-Khalq, or MEK), calling for it to be delisted as a terrorist group by the United States, which occurred on September 21, 2012. In September 2015, Dean endorsed Hillary Clinton for the 2016 presidential election rather than Bernie Sanders, the senator from his home state of Vermont. Dean questioned on Twitter whether Donald Trump's sniffing during a presidential debate was due to cocaine use, and later apologized for "using innuendo." In a January 2009 interview with the Associated Press, Dean indicated he would enter the private sector after 30 years in politics. Dean told the AP he would deliver speeches and share ideas about campaigns and technology with center-left political parties around the world. He became a contributor to the news network MSNBC, appearing on shows such as The Last Word with Lawrence O'Donnell, and has also guest-hosted Countdown with Keith Olbermann and The Rachel Maddow Show. He is on the board of the National Democratic Institute. Dean also serves as a Senior Presidential Fellow at Hofstra University. He has been a Senior Fellow at the Yale Jackson Institute for Global Affairs and a visiting professor at Williams College. He has been a Senior Strategic Advisor and Independent Consultant for the Government Affairs practice at McKenna Long & Aldridge. In December 2018, Dean joined the advisory board of Tilray, one of the world's largest cannabis companies. Dean is a member of the Canadian American Business Council's Advisory Board. Personal life In 1981, Dean married fellow doctor Judith Steinberg, whom he met in medical school, and together they began a family medical practice in Shelburne, Vermont (where she continued to use her maiden name to avoid confusion). Although raised as an Episcopalian, Dean joined a Congregational church in 1982 after a dispute with the local Episcopal diocese over a bike trail. By his own account, he did not attend church as of the early 2000s. At one point, when asked to name his favorite book in the New Testament, he offered the Old Testament Book of Job, then corrected himself an hour later. Dean has stated he is "more spiritual than religious". He and his Jewish wife, Judith Steinberg Dean, raised their two children, Anne and Paul, with a secular education, and both children self-identify as Jews.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_note-yahoo-49] | [TOKENS: 8626]
Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger, Threads and Manus. The company also operates an advertising network for its own sites and third parties; as of 2023, advertising accounted for 97.8 percent of its total revenue. Meta has been described as part of Big Tech, a term for the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse—an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters priced the shares at $38 each, valuing the company at $104 billion, the largest valuation to that date for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand. The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations—surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods—and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for the trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline.
On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. S&P Dow Jones Indices added Facebook to the S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced Facebook Gameroom, formerly Facebook Games Arcade, a Microsoft Windows client for its gaming service, at the Unity Technologies developers conference. The client allowed Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook, similar to TikTok, that was launched on iOS and Android in 2018 and was aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would be shutting down on July 10. In 2018, the Oculus lead Jason Rubin sent his 50-page vision document titled "The Metaverse" to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He also urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" so as to prevent its competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop its own stablecoin cryptocurrency. Later, it was reported that Libra was being supported by companies such as Visa, Mastercard, PayPal and Uber. The members of the consortium were expected to contribute $10 million each to fund the launch of the cryptocurrency, named Libra. Pending approval from the Swiss Financial Market Supervisory Authority to operate as a payments service, the Libra Association planned to launch a limited-format cryptocurrency in 2021. Libra was renamed Diem, before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021, about Facebook's plan to rebrand the company and change its name.
In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the company's pivot toward building the metaverse – without mentioning the rebranding or the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms were introduced at Facebook Connect on October 28, 2021. According to Facebook's PR campaign, the name change reflected the company's shifting long-term focus toward building the metaverse, a digital extension of the physical world through social media, virtual reality, and augmented reality features. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services, by a Canadian company that provided big data analysis of scientific literature. This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project and would therefore transfer its rights to the name to Meta Platforms; the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users, and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertising revenue, an amount equal to roughly 8% of its revenue for 2021. In a meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% reduction in the company's share price in reaction to the news eliminated some $230 billion of value from Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion. Zuckerberg owns 13% of Meta, and the holding makes up the bulk of his wealth. According to reports published by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Meta's Facebook and Instagram (though not Meta-owned WhatsApp) were banned in Russia, and Meta was added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech (including alleged calls to genocide) amid the ongoing Russian invasion of Ukraine. Meta appealed the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a line of smart glasses that could play music and take pictures.
Meta and Luxottica's parent company EssilorLuxottica declined to disclose sales figures for the product line as of September 2022, though Meta expressed satisfaction with customer feedback. In July 2022, Meta saw its first year-on-year revenue decline when its total revenue slipped by 1% to $28.8 billion. Analysts and journalists attributed the decline to its advertising business, which had been constrained by Apple's App Tracking Transparency feature and the number of people who had opted not to be tracked by Meta's apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite having reached the top 5 in the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn and "ads signal loss". Plans to lay off a further 10,000 employees began in April 2023. The layoffs were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter and Lyft. Starting in 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive's aim of making its data available for social science research. In 2023, Ireland's Data Protection Commissioner imposed a record €1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens. In March 2023, Meta announced a new round of layoffs that would cut 10,000 employees and close 5,000 open positions to make the company more efficient. Meta's revenue surpassed analyst expectations for the first quarter of 2023 after the company announced it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers like Microsoft. It was the first project unveiled by Meta's generative AI group after the group was set up in February. Meta would not charge for access or usage but would instead operate under an open-source model, allowing Meta to ascertain what improvements needed to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use; an earlier version of Llama had been released to academics. In August 2023, Meta announced its permanent removal of news content from Facebook and Instagram in Canada due to the Online News Act, which requires that Canadian news outlets be compensated for content shared on its platforms. The Online News Act was in effect by year-end, but Meta declined to participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the best-performing technology stocks of the year, with its share price up 150 percent.
Its stock reached an all-time high in January 2024, bringing Meta within 2% of achieving a $1 trillion market capitalization. In November 2023, Meta launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscriber model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA for the alleged use of its social media platforms to sell illegal drugs. On May 16, 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan had encountered a troubling issue when Instagram removed his posts, citing copyright violations that were false, as his content was original and free of copyrighted material. He discovered that extortionists were behind these takedowns, offering to restore his content for $3,000 or to provide ongoing protection for $1,000 per month. This scam, exploiting Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures such as Ammar al-Hakim. The situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On September 16, 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity." The decision followed allegations that RT and its employees had funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive, and the company instead pivoted to producing a small number of the glasses for internal use. On October 4, 2024, Meta announced its new AI model Movie Gen, capable of generating realistic video and audio clips from user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products the following year. The model was built using a combination of licensed and publicly available datasets. On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue.
In November 2024, TechCrunch reported that Meta was considering building a $10 billion global underwater cable spanning 25,000 miles. In the same month, Meta closed down 2 million accounts on Facebook and Instagram that were linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates running "pig butchering" scams. In December 2024, Meta announced that, beginning in February 2025, it would require advertisers running financial services ads in Australia to verify beneficiary and payer information, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On December 11, Meta experienced a global outage affecting accounts on all of its social media and messaging applications; outage reports on DownDetector reached more than 70,000 for Instagram and more than 100,000 for Facebook within minutes. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including through changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception allowing users to call LGBTQ people mentally ill on account of their being gay or transgender. Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion in value, which would make it one of the largest private-company funding events of all time. In October 2025, Meta announced it would lay off 600 employees in its artificial intelligence unit, which it described as "bloated", in an effort to streamline the department. The layoffs affected Meta's AI infrastructure units, its Fundamental Artificial Intelligence Research unit (FAIR), and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions was in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy the mobile messaging company WhatsApp for US$19 billion in cash and stock; the acquisition was completed on October 6. Later that year, Facebook bought Oculus VR for $2.3 billion in cash and stock; Oculus released its first consumer virtual reality headset in 2016. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber.
In late 2022, after Facebook, Inc. rebranded as Meta Platforms, Inc., Oculus was rebranded as Meta Quest. In May 2020, Facebook, Inc. announced it had acquired Giphy for a reported cash price of $400 million; it was to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controlled half of the advertising in the UK. Meta agreed to the sale, though it stated that it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot specialist startup Kustomer, to encourage companies to use its platform for business. The deal reportedly valued Kustomer at slightly over $1 billion and was closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced that Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had achieved $100 million in recurring revenue just eight months after its launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators; the examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists. In 2019, it had spent $16.7 million on lobbying and had a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest spender of lobbying money among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund of then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Representative Jim Jordan indicating that during the COVID-19 pandemic the Biden administration had repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, formerly an employee at the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers.
Following Donald Trump's return to office in 2025, various sources noted possible censorship of content related to the Democratic Party on Instagram and other Meta platforms. In February 2025, a Meta representative flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed workplace sexual harassment by senior executives. The New York Times reported that the arbitration was among Meta's most forceful attempts to repudiate a former employee's account of workplace dynamics. Publisher Macmillan reacted to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of March 15, 2025, hardback and digital versions of Careless People were being offered for sale by major online retailers. Beginning in October 2025, Meta removed or restricted access to accounts and pages related to LGBTQ issues, reproductive health, and abortion information on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "one of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of being a host for fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began taking steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives, which involved partnering with third-party fact-checkers and publicly flagging fake news, were regularly ineffective and appeared to have minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue disseminating a falsified video of US President Joe Biden, even after it had been proven fake, attracted criticism and concern. In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform was excessive, the decision drew criticism from fact-checking institutions, which said the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening its policies on hate speech that were designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, and to explicitly allow users to accuse LGBT people of being mentally ill or abnormal based on their sexual orientation or gender identity.
In January 2025, Meta faced significant criticism for its role in removing LGBTQ+ content from its platforms, amid its broader efforts to address anti-LGBTQ+ hate speech. The removal of LGBTQ+ themes was noted as part of a wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that some critics argued were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that were historically important for outreach and support. Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc. and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for serious and repeated privacy violations connected to the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical maximum liability of $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission and 46 states (excluding Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia, and the territory of Guam launched Federal Trade Commission v. Facebook, an antitrust lawsuit against Facebook. The lawsuit concerned Facebook's acquisition of two competitors—Instagram and WhatsApp—and the ensuing monopolistic situation. The FTC alleged that Facebook held monopoly power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to construct a counterfactual of an internet in which the Facebook-WhatsApp-Instagram entity did not exist, and to prove that this harmed competition or consumers. In November 2025, a court ruled that Meta had not violated antitrust laws and held no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging poor working conditions in Kenya for workers moderating Facebook posts. According to the lawsuit, 260 screeners were declared redundant with unclear justification. The lawsuit sought financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram had led to attempted or actual suicides, eating disorders, and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility.
The company noted that tools had been developed for parents to keep track of their children's activity on Instagram and to set time limits, in addition to Meta's "Take a break" reminders. The company also provided resources specific to eating disorders and was developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, which was filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics. The U.S. Department of Justice stated that this was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the ad-targeting tool at issue. In January 2023, Meta was fined €390 million for violations of the European Union General Data Protection Regulation. In May 2023, the European Data Protection Board fined Meta a record €1.2 billion for breaching European Union data privacy laws by transferring the personal data of Facebook users to servers in the U.S. In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements; the plaintiffs sought approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit in which three plaintiffs claimed that Facebook had inflamed civil violence in Ethiopia in 2021 could proceed. In April 2025, Meta was fined €200 million ($230 million) for breaching the Digital Markets Act by imposing a "consent or pay" system that forces users either to allow their personal data to be used for targeted advertisements or to pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence, and child sexual abuse. Meta had moved the moderation service to the Ghanaian capital, Accra, after legal issues in Kenya, the previous location. The new moderation company is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest conditions there are worse than at the previous Kenyan location, with many workers afraid to speak out for fear of being returned to conflict zones. Workers reported developing mental illnesses, attempted suicides, and low pay. On January 26, 2026, a case was filed in a New Mexico state court alleging that Mark Zuckerberg had approved allowing minors to access artificial intelligence chatbot companions that safety staffers warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of "digital armies", filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties.
Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure As of October 2022, Meta had 83,553 employees worldwide. Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares, while insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg, and Christopher K. Cox. Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it is the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes. European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 in the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York magazine, since its rebranding Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google, which prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses. An active advertiser was defined as an entity that had advertised on the Facebook platform in the last 28 days. In March 2016, Facebook announced it had reached three million active advertisers, with more than 70% from outside the United States. Advertising prices follow a variable pricing model based on auctioned ad placements and the potential engagement level of the advertisement itself. As with other online advertising platforms such as Google and Twitter, ad targeting is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta's platforms is employed through two methods, both based on the audience's viewing habits, likes and shares, and purchasing data: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing, and Meta faces a potential fine of $3–5 billion. The U.S. Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations.
Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e. Irish profits). On the basis that Meta Platforms Ireland Limited is paying some tax, the effective minimum US tax for Facebook Ireland would be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. the Irish GILTI rate) and accelerated capital expensing would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5 billion non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta makes use of the Double Irish arrangement, which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who lack it. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai. In September 2023, it was reported that Meta had paid £149m to British Land to break the lease on its Triton Square office in London; Meta reportedly had another 18 years left on the lease. As of 2023, Meta operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Frances Haugen, the ex-Facebook employee and whistleblower behind the Facebook Papers, responded to the rebranding efforts by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use-case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well.
I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes. Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as 7 years old on the platform, despite Oculus headsets being intended for users aged 13 and over.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Alien_intelligence] | [TOKENS: 1352]
Extraterrestrial intelligence Extraterrestrial intelligence (ETI) refers to hypothetical beings – extraterrestrial life, or more artificial entities – possessing cognitive abilities similar to those of humans. No such life has ever been verifiably observed to exist. The question of whether other inhabited worlds might exist has been debated since antiquity. The modern form of the concept emerged when the Copernican Revolution demonstrated that the Earth was a planet revolving around the Sun, and that other planets were, conversely, other worlds. The question of whether other inhabited planets or moons exist was a natural consequence of this new understanding. It has become one of the most speculative questions in science and is a central theme of both science fiction and popular culture. An alternative name for it is "Extraterrestrial Technological Instantiations" (ETI). The term was coined to avoid terms such as "civilizations", "species", and "intelligence", as those may prove to be ambiguous and open to interpretation, or simply inapplicable in a given local context. Intelligence Intelligence is, along with the more precise concept of sapience, used to describe extraterrestrial life with cognitive abilities similar to those of humans. Another interchangeable term is sophoncy, the quality of being wise; it was coined by Karen Anderson and first published in the 1966 works of her husband, Poul Anderson. Sentience, like consciousness, is sometimes mistakenly used to refer to intelligence and sapience, even though it does not exclude forms of life that are non-sapient (or, more broadly, non-intelligent or non-conscious). The term extraterrestrial civilization frames a more particular case of extraterrestrial intelligence. It is the possible long-term result of intelligent and specifically sapient extraterrestrial life. Probability The Copernican principle generalizes to the relativistic concept that humans are not privileged observers of the universe. Many prominent scientists, including Stephen Hawking, have proposed that the sheer scale of the universe makes it improbable for intelligent life not to have emerged elsewhere. However, the Fermi paradox highlights the apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilizations and humanity's lack of contact with, or evidence for, such civilizations. So far, there has been no observation of extraterrestrial life, including intelligent extraterrestrial life. The Kardashev scale is a speculative method of measuring a civilization's level of technological advancement, based on the amount of energy a civilization is able to utilize. The Drake equation is a probabilistic framework used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy; both frameworks are illustrated in the sketch below. A 2020 study estimated that there could be about 36 alien civilizations in the Milky Way today. With 36 "communicating extraterrestrial intelligent" civilizations, the average distance between them would be around 17,000 light-years. However, it is conceivable that some civilizations have created machines, probes or similar systems that outlast them.
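To make these two frameworks concrete, the following minimal Python sketch evaluates the Drake equation and Carl Sagan's continuous Kardashev rating, K = (log10 P − 6) / 10 for a power output P in watts. The parameter values are illustrative assumptions only, chosen here so that the product happens to land near the 36-civilization figure quoted above; they are not taken from that study.

import math

def drake_equation(r_star, f_p, n_e, f_l, f_i, f_c, lifetime_years):
    # N = R* x fp x ne x fl x fi x fc x L, where:
    # r_star: mean rate of star formation in the galaxy (stars per year)
    # f_p: fraction of stars that host planetary systems
    # n_e: mean number of potentially habitable planets per such system
    # f_l: fraction of those planets on which life actually appears
    # f_i: fraction of life-bearing planets that develop intelligent life
    # f_c: fraction of intelligent species that emit detectable signals
    # lifetime_years: mean time such a civilization remains detectable
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime_years

def kardashev_rating(power_watts):
    # Sagan's interpolation: Type I ~ 1e16 W, Type II ~ 1e26 W, Type III ~ 1e36 W.
    return (math.log10(power_watts) - 6.0) / 10.0

# Illustrative inputs only; every factor is debatable.
n = drake_equation(1.5, 1.0, 0.4, 0.5, 0.2, 0.2, 3000)
print(f"Estimated communicating civilizations: {n:.0f}")  # prints 36

# World power consumption is roughly 2e13 W, giving K of about 0.73.
print(f"Approximate human Kardashev rating: {kardashev_rating(2e13):.2f}")

Because the Drake equation is a simple product, the estimate is linearly sensitive to each factor (halving any single fraction halves N), which is one reason published estimates span many orders of magnitude.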
Physical messages like that of the Pioneer plaque may also be considered an active SETI message. Communication with extraterrestrial intelligence (CETI) is a branch of the search for extraterrestrial intelligence that focuses on composing and deciphering messages that could theoretically be understood by another technological civilization. The best-known CETI experiment was the 1974 Arecibo message composed by Frank Drake and Carl Sagan; the design idea behind its bit count is sketched below. There are multiple independent organizations and individuals engaged in CETI research. The U.S. government's position, in line with that of most relevant experts, is that "chances of contact with an extraterrestrial intelligence are extremely small, given the distances involved." This line of thinking has led some to conclude that first contact might be made with extraterrestrial artificial intelligence rather than with biological beings. The Wow! signal remains the best candidate for an extraterrestrial radio signal ever detected, though the fact that no similar signal has ever been observed again makes attributing the signal to any cause difficult, if not impossible. On 14 June 2022, astronomers working with China's FAST telescope reported the possibility of having detected artificial (presumably alien) signals, but cautioned that further studies were required to determine whether some kind of natural radio interference might be the source. On 18 June 2022, Dan Werthimer, chief scientist for several SETI-related projects, reportedly noted that "These signals are from radio interference; they are due to radio pollution from earthlings, not from E.T." Potential cultural impact of extraterrestrial contact The potential changes from extraterrestrial contact could vary greatly in magnitude and type, based on the extraterrestrial civilization's level of technological advancement, degree of benevolence or malevolence, and level of mutual comprehension between itself and humanity. Some theories suggest that an extraterrestrial civilization could be advanced enough to dispense with biology, living instead inside advanced computers. The medium through which humanity is contacted, be it electromagnetic radiation, direct physical interaction, extraterrestrial artefact, or otherwise, may also influence the results of contact. Incorporating these factors, various systems have been created to assess the implications of extraterrestrial contact. The implications of extraterrestrial contact, particularly with a technologically superior civilization, have often been likened to the meeting of two vastly different human cultures on Earth, a historical precedent being the Columbian Exchange. Such meetings have generally led to the destruction of the civilization receiving contact (as opposed to the "contactor", which initiates contact), and therefore destruction of human civilization is a possible outcome. However, the absence of any such contact to date means such conjecture is largely speculative. UFOlogy The extraterrestrial hypothesis is the idea that some UFOs are vehicles containing or sent by extraterrestrial beings (usually called aliens in this context). As an explanation for UFOs, ETI is sometimes contrasted with EDI (extradimensional intelligence), for example by J. Allen Hynek. In 2023, United States House of Representatives lawmakers held a hearing to examine how the American executive branch handles reports of UFOs.
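The Arecibo message mentioned above illustrates the basic trick CETI designers use to make a message decodable without a shared language: it consists of 1,679 binary digits, and 1,679 is the product of the two primes 23 and 73, so a recipient who tries to arrange the bits into a rectangle finds only one non-trivial grid and can recover the intended raster image. The following minimal Python sketch of that inference step is illustrative only, not code from any SETI project:

```python
def grid_shapes(n_bits: int) -> list[tuple[int, int]]:
    """Return every (rows, cols) factorization of n_bits, excluding 1 x n.

    A semiprime bit count such as 1679 = 23 * 73 yields exactly one pair,
    which is what lets a receiver guess the intended raster dimensions.
    """
    return [(r, n_bits // r)
            for r in range(2, int(n_bits ** 0.5) + 1)
            if n_bits % r == 0]

print(grid_shapes(1679))  # [(23, 73)] -> read the bits as a 23-by-73 (or 73-by-23) grid
```

The actual message was laid out as 73 rows of 23 columns, so a receiver would at worst have to try the two orientations.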
In culture The possibility of intelligent extraterrestrial life, and theories about its probability, have been a recurring cultural element, especially in popular culture since the prospect and achievement of spaceflight. In 2003, New Mexico even declared 14 February to be Extraterrestrial Culture Day.
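As a concrete illustration of the probability machinery discussed above, here is a minimal Python sketch. The Drake-equation inputs are illustrative guesses chosen only so that the result lands near the 36 civilizations quoted earlier; they are not the parameters of the 2020 study. The spacing estimate is a simple uniform-disc geometry argument, likewise not that study's method, and the Kardashev rating uses Carl Sagan's commonly cited logarithmic interpolation rather than Kardashev's original three discrete types.

```python
import math

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: N = R* . fp . ne . fl . fi . fc . L, the expected
    number of active, communicative civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Hypothetical inputs: star formation rate (stars/yr), fraction of stars
# with planets, habitable planets per such star, the fractions that
# develop life, intelligence, and detectable communication, and the
# average communicative lifetime in years.
N = drake(R_star=1.5, f_p=0.9, n_e=0.5, f_l=0.3, f_i=0.1, f_c=0.5, L=3500)
print(f"Drake estimate: ~{N:.0f} communicative civilizations")  # ~35

# If N civilizations are spread uniformly over the galactic disc
# (radius ~50,000 light-years), the typical spacing is roughly the
# disc diameter divided by sqrt(N).
R_GALAXY_LY = 50_000
spacing = 2 * R_GALAXY_LY / math.sqrt(36)
print(f"Typical spacing for 36 civilizations: ~{spacing:,.0f} light-years")  # ~16,667

def kardashev(power_watts):
    """Sagan's continuous interpolation of the Kardashev scale:
    K = (log10 P - 6) / 10, with P in watts."""
    return (math.log10(power_watts) - 6) / 10

print(f"A civilization using 2e13 W rates Type {kardashev(2e13):.2f}")  # ~0.73
```

Note how sensitive the result is to the lifetime factor: N is linear in L, so halving the assumed communicative lifetime halves the estimate, which is one reason published figures span many orders of magnitude.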
========================================
[SOURCE: https://en.wikipedia.org/wiki/Template:General_relativity_sidebar] | [TOKENS: 53]
Template:General relativity sidebar TemplateData for General relativity sidebar: sidebar template for use at or near the top of articles. Template parameters: set in the manner described on this documentation page.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Puck_(folklore)] | [TOKENS: 1208]
Puck (folklore) In English folklore, the Puck (/ˈpʌk/), also known as a Goodfellow, is a demon or fairy which can be a domestic sprite or a nature sprite. Origins and comparative folklore The etymology of puck is uncertain. The present-day English word is attested already in Old English as puca (with a diminutive form pucel). Similar words are attested later in Old Norse (púki, with related forms including Old Swedish puke, Icelandic púki, and Frisian puk) but also in the Celtic languages (Welsh pwca, Cornish bucca and Irish púca). Most commentators think that the word was borrowed from one of these neighbouring north-west European languages into the others, but it is not certain in what direction the borrowing went, and all vectors have been proposed by scholars. The Oxford English Dictionary favoured a Scandinavian origin, while the scholarly study by Erin Sebo of Flinders University argues for an Irish origin, on the basis that the word is widely distributed in Irish place-names, whereas puck-place-names in English are rare and late in the areas showing Old Norse influence, and seem rather to radiate outwards from South West England, which she argues had Irish influence during the Early Middle Ages. Puck may also be called Goodfellow or Hobgoblin, in which Hob may substitute for Rob or Robin; both names go back to the character "Robin Goodfellow". The name Robin is Middle English in origin, deriving from Old French Robin, the pet form of the name Robert. Similar to the use of "the good folk" in describing fairies, it reflected a degree of wishful thinking and an attempt to appease the fairies, recognizing their fondness for flattery despite their mischievous nature. The earliest reference to "Robin Goodfellow" cited by the Oxford English Dictionary is from 1531. Anthony Munday mentions Robin Goodfellow in his play The Two Italian Gentlemen, 1584, and he appears in Skialtheia, or a Shadowe of Truth in 1598. William Shakespeare may have had access to the manuscript of Lewes Lewkenor's translation of The Spanish Mandevile of Miracles, or, The Garden of Curious Flowers (1600), a translation of Antonio de Torquemada's Jardín de Flores Curiosas. The following passage from The Spanish Mandeville discusses the mischievous spirits: Luduvico: I pray you let me somewhat understand your opinion as concerning Robingoodfellowes and Hobgoblins, which are said to be so common, that there is scarcely any man but will tell you one tale or other of them, of which for my own part, I believe none, but do make reckoning that every man forgeth herein, what pleaseth him. Antonio: Many of them without doubt are forged, and many also true, for these kinds of Spirits are more familiar and domestical than the others, and for some causes to us unknown, abide in one place, more than in another, so that some never almost depart from some particular houses, as though they were their proper mansions, making in them sundry noises, rumours, mockeries, gawdes and jests, without doing any harm at all: and though I am not myself witness thereof, yet I have heard many persons of credit affirm that they have heard them play as it were on Gyterns & Jews Harps, and ring Bells, and that they answer to those that call them, and speak with certain signs, laughters and merry gestures, so that those of the house come at last to be so familiar and well acquainted with them that they fear them not at all.
But in truth, as I said before, if they had free power to put in practice their malicious desire, we should find these pranks of theirs, not to be jests, but earnest indeed, tending to the destruction of both our body and soul, but as I told you before, this power of theirs is so restrained and tied, that they can pass no farther than to jests and gawdes: and if they do any harm or hurt at all, it is certain very little, as by experience we daily see. After Giacomo Meyerbeer's successful opera Robert le Diable (1831), neo-medievalists and occultists began to apply the name Robin Goodfellow to the Devil, with appropriately extravagant imagery. Bwbach or, in its diminutive form, bwbachyn, has also been used in Welsh mythology. Characteristics According to Brewer's Dictionary of Phrase and Fable (1898): [Robin Goodfellow is a] "drudging fiend", and merry domestic fairy, famous for mischievous pranks and practical jokes. At night-time he will sometimes do little services for the family over which he presides. The Scots call this domestic spirit a brownie; the Germans, Kobold or Knecht Ruprecht. Scandinavians called it Nissë God-dreng. Puck, the jester of Fairy-court, is the same. Puck might do minor housework, quick fine needlework or butter-churning, which could be undone in a moment by his knavish tricks if displeased. A domestic spirit, he would assist housewives with their chores, in expectation of an offering of white bread and milk. If this were neglected, he would steal what he believed he was owed. Pucks are also known to be inherently solitary creatures. Shakespeare's characterization of the "shrewd and knavish" Puck in A Midsummer Night's Dream may have revived flagging interest in Puck. Notable cultural references This list excludes Shakespearean references; they may be found at Puck (A Midsummer Night's Dream).
========================================
[SOURCE: https://en.wikipedia.org/wiki/Tiberias] | [TOKENS: 7818]
Tiberias Tiberias (/taɪˈbɪəriəs/ ty-BEER-ee-əs; Hebrew: טבריה, Ṭəveryā; Arabic: طبريا, romanized: Ṭabariyyā) is a city on the western shore of the Sea of Galilee in northern Israel. A major Jewish center during Late Antiquity, it has been considered since the 18th century one of Judaism's Four Holy Cities, along with Jerusalem, Hebron, and Safed. In 2023 it had a population of 51,476. Tiberias was founded around 20 CE by Herod Antipas and was named after the Roman emperor Tiberius. It became a major political and religious hub of the Jews in the Land of Israel after the destruction of Jerusalem and the desolation of Judea during the Jewish–Roman wars. From the second through the tenth centuries CE, Tiberias was the largest Jewish city in Galilee, and much of the Mishna and the Jerusalem Talmud were compiled there. Tiberias flourished during the Early Muslim period, when it served as the capital of Jund al-Urdunn and became a multi-cultural trading center. The city declined in importance over time due to earthquake damage and foreign incursions. After the Galilee earthquake of 1837 the city was rebuilt and grew steadily following the First Jewish Aliyah in the 1880s. In early modern times, Tiberias was a mixed city; under British rule it had a majority Jewish population, but with a significant Arab community. During the 1947–1948 civil war in Mandatory Palestine, fighting broke out between the Jewish residents of Tiberias and its Palestinian Arab minority. As the Haganah took over, British troops evacuated the entire Palestinian Arab population; they were refused reentry after the war, such that today the city has an almost exclusively Jewish population. After the war ended, the new Israeli authorities destroyed the Old City of Tiberias. A large number of Jewish immigrants to Israel subsequently settled in Tiberias. Today, Tiberias is an important tourist center due to its proximity to the Sea of Galilee and its religious sanctity to Judaism and Christianity. The city also serves as a regional industrial and commercial center. Its immediate neighbour to the south, Hammat Tiberias, which is now part of modern Tiberias, has been known for some two thousand years for its hot springs, believed to cure skin and other ailments. Etymology The city of Tiberias was named after the Roman Emperor Tiberius. A Midrash regarding the city's name appears in the Babylonian Talmud, Tractate Megillah 6a: "Rabbi Yirmiya said: [...] And why is it called Tiberias? Because it sits at the navel (tabur) of the Land of Israel". Rashi, in his commentary, provides two interpretations of the Midrash. The first explains the word "navel" (tabur) as referring to the lowest point in the Land of Israel. The second interpretation suggests that Tiberias is situated at the geographical center of the Land of Israel. Ishtori Haparchi later substantiated this claim through measurements of the entire land, demonstrating its accuracy. History Jewish tradition holds that Tiberias was built on the site of the ancient Israelite village of Rakkath or Rakkat, first mentioned in the Book of Joshua. In Talmudic times, the Jews still referred to it by this name. Tiberias was founded sometime around 18–20 CE in the Herodian Tetrarchy of Galilee and Perea by the Roman client king Herod Antipas, son of Herod the Great. Herod Antipas made it the capital of his realm in Galilee and named it after the Roman emperor Tiberius.
The city was built in immediate proximity to a spa which had developed around seventeen natural mineral hot springs, Hammat Tiberias. Tiberias was at first a strictly pagan city, but later became populated mainly by Jews, with its growing spiritual and religious status exerting a strong influence on balneological practices. Conversely, in Antiquities of the Jews, the Roman-Jewish historian Josephus calls the village with hot springs Emmaus, today's Hammat Tiberias, located near Tiberias. This name also appears in his work The Jewish War. Under the Roman Empire, the city was known by its Koine Greek name Τιβεριάς (Tiberiás, Greek: Τιβεριάδα, romanized: Tiveriáda). In the days of Herod Antipas, some of the most religiously orthodox Jews, who were struggling against the process of Hellenisation, which had affected even some priestly groups, refused to settle there: the presence of a cemetery rendered the site ritually unclean for the Jews and particularly for the priestly caste. Antipas settled many non-Jews there from rural Galilee and other parts of his domains in order to populate his new capital, and built a palace on the acropolis. The prestige of Tiberias was so great that the Sea of Galilee soon came to be named the Sea of Tiberias; however, the Jewish population continued to call it Yam HaKineret, its traditional name. The city was governed by a city council of 600 with a committee of ten until 44 CE, when a Roman procurator was set over the city after the death of Herod Agrippa I. The city is estimated to have had a population between 4,500 and 15,000 during the first century CE. Tiberias is mentioned in John 6:23 as the location from which boats had sailed to the opposite, eastern side of the Sea of Galilee. The crowd seeking Jesus after the miraculous feeding of the 5000 used these boats to travel back to Capernaum on the north-western part of the lake. In 61 CE, Herod Agrippa II annexed the city to his kingdom, whose capital was Caesarea Philippi. During the First Jewish–Roman War, the Jewish rebels took control of the city and destroyed Herod's palace, and were able to prevent the city from being pillaged by the army of Agrippa II, the Jewish ruler who had remained loyal to Rome. Eventually, the rebels were expelled from Tiberias, and while most other cities in the provinces of Judaea, Galilee and Idumea were razed, Tiberias was spared this fate because its inhabitants had decided not to fight against Rome. It became a mixed city after the fall of Jerusalem in 70 CE; with Judea subdued, the surviving southern Jewish population migrated to Galilee. There is no direct indication that Tiberias, or the rest of Galilee, took part in the Bar Kokhba revolt of 132–136 CE, which allowed it to continue to exist, despite a heavy economic decline due to the war. Following the expulsion of Jews from Judea after 135 CE, Tiberias and its neighbour Sepphoris (Hebrew name: Tzippori) became the major Jewish cultural centres. According to the Talmud, in 145 CE, Rabbi Simeon bar Yochai, who was very familiar with Galilee, having hidden there for over a decade, "cleansed the city of ritual impurity", allowing the Jewish leadership to resettle there from Judea, which they had been forced to leave as fugitives.
The Sanhedrin, the Jewish court, also fled from Jerusalem during the Great Jewish Revolt against Rome, and after several attempted moves, in search of stability, eventually settled in Tiberias in about 220 CE. It was to be its final meeting place before its disbanding in 425 CE. When Johanan bar Nappaha (d. 279) settled in Tiberias, the city became the focus of Jewish religious scholarship in the land, and the so-named Jerusalem Talmud was compiled by his school in Tiberias between 230 and 270 CE. Tiberias' 13 synagogues served the spiritual needs of a growing Jewish population. Tombs of the famous rabbis Yohanan ben Zakkai, Akiva and Maimonides are also located in the city. During the Byzantine period, Tiberias was incorporated into the province of Palaestina Secunda. In the 6th century Tiberias was still the seat of Jewish religious learning. In light of this, the Letter of Simeon of Beth Arsham urged the Christians of Palaestina to seize the leaders of Judaism in Tiberias, to put them to the rack, and to compel them to command the Jewish king, Dhu Nuwas, to desist from persecuting the Christians in Najran. In 614, Tiberias was the site where, during the final Jewish revolt against the Byzantine Empire, parts of the Jewish population supported the Persian invaders; the Jewish rebels were financed by Benjamin of Tiberias, a man of immense wealth; according to Christian sources, during the revolt Christians were massacred and churches destroyed. In 628, the Byzantine army returned to Tiberias upon the surrender of the Jewish rebels and the end of the Persian occupation, after the Persians were defeated at the Battle of Nineveh. A year later, influenced by radical Christian monks, Emperor Heraclius instigated a wide-scale slaughter of the Jews, which practically emptied Galilee of most of its Jewish population, with survivors fleeing to Egypt. Tiberias, or Tabariyyah in Arab transcription, was "conquered by (the Arab commander) Shurahbil in the year 634/15 [CE/AH] by capitulation; one half of the houses and churches were to belong to the Muslims, the other half to the Christians." Following the Rashidun conquest, Tiberias served as the regional capital from 636 CE until Beit She'an took its place. The Caliphate allowed 70 Jewish families from Tiberias to form the core of a renewed Jewish presence in Jerusalem, and the importance of Tiberias to Jewish life declined. The caliphs of the Umayyad Dynasty built one of their square-plan palaces on the waterfront to the north of Tiberias, at Khirbat al-Minya. Tiberias was revitalised in 749, after Bet Shean was destroyed in an earthquake. An imposing mosque, 90 metres (300 feet) long by 78 metres (256 feet) wide, resembling the Great Mosque of Damascus, was raised at the foot of Mount Berenice next to a Byzantine church, to the south of the city, as the eighth century ushered in Tiberias's golden age, when the multicultural city may have been the most tolerant of the Middle East. Jewish scholarship flourished from the beginning of the 8th century to the end of the 10th, when the oral traditions of ancient Hebrew, still in use today, were codified. One of the leading members of the Tiberian Masoretic community was Aaron ben Moses ben Asher, who refined the oral tradition now known as Tiberian Hebrew. Both the Codex Cairensis and the Aleppo Codex were written in Tiberias, and the Tiberian vocalization was devised there.
The Arab geographer al-Muqaddasi, writing in 985, describes Tiberias as a hedonistic city afflicted by heat: "For two months they dance; for two months they gobble; for two months they swat; for two months they go about naked; for two months they play the reed flute; and for two months they wallow in the mud." He further describes it as "the capital of Jordan Province, and a city in the Valley of Canaan. ... The town is narrow, hot in summer and unhealthy... There are here eight natural hot baths, where no fuel need be used, and numberless basins besides of boiling water. The mosque is large and fine, and stands in the market-place. Its floor is laid in pebbles, set on stone drums, placed close one to another." According to Muqaddasi, those who suffered from scab, ulcers, and other such diseases came to Tiberias to bathe in the hot springs for three days. "Afterwards they dip in another spring which is cold, whereupon ... they become cured." Tiberias was plagued by incursions by the radical Shi'ite Qarmatians at the beginning of the tenth century. During that period, the Academy of Eretz Israel left Tiberias for Jerusalem. Later in the same century, the region came under the control of the Fatimid Caliphate. By this time, Tiberias was experiencing its last period of prosperity; dried fruit, oil, and wine were exported to Cairo via the Via Maris, and the city was also known for its mat industry. In 1033 Tiberias was again destroyed by an earthquake. A further earthquake in 1066 toppled the great mosque. Nasir-i Khusrou visited Tiberias in 1047, and describes a city with a "strong wall" which begins at the border of the lake and goes all around the town except on the water-side. Furthermore, he describes numberless buildings erected in the very water, for the bed of the lake in this part is rock; and they have built pleasure houses that are supported on columns of marble, rising up out of the water. The lake is very full of fish. [...] The Friday Mosque is in the midst of the town. At the gate of the mosque is a spring, over which they have built a hot bath. [...] On the western side of the town is a mosque known as the Jasmine Mosque (Masjid-i-Yasmin). It is a fine building and in the middle part rises a great platform (dukkan), where they have their mihrabs (or prayer-niches). All round those they have set jasmine-shrubs, from which the mosque derives its name. During the First Crusade, Tiberias was occupied by the Franks soon after the capture of Jerusalem. The city was given in fief to Tancred, who made it the capital of his Principality of Galilee in the Kingdom of Jerusalem; the region was sometimes called the Principality of Tiberias, or the Tiberiad. In 1099 the original site of the city was abandoned, and settlement shifted north to the present location. St. Peter's Church, originally built by the Crusaders, is still standing today, although the building has been altered and reconstructed over the years. In the late 12th century Tiberias' Jewish community numbered 50 Jewish families, headed by rabbis, and at that time the best manuscripts of the Torah were said to be found there. In the 12th century, the city acquired negative undertones in Islamic tradition. A hadith recorded by Ibn Asakir of Damascus (d. 1176) names Tiberias as one of the "four cities of hell." This may have reflected the fact that, at the time, the town had a notable non-Muslim population.
In 1187, Saladin ordered his son al-Afdal to send an envoy to Count Raymond of Tripoli requesting safe passage through his fiefdom of Galilee and Tiberias. Raymond was obliged to grant the request under the terms of his treaty with Saladin. Saladin's force left Caesarea Philippi to engage the fighting force of the Knights Templar. The Templar force was destroyed in the encounter. Saladin then besieged Tiberias; after six days the town fell. On 4 July 1187, Saladin defeated the Crusaders coming to relieve Tiberias at the Battle of Hattin, 10 kilometres (6 miles) outside the city. However, during the Third Crusade, the Crusaders drove the Muslims out of the city and reoccupied it. Rabbi Moshe ben Maimon (Maimonides), also known as Rambam, a leading Jewish legal scholar, philosopher and physician of his period, died in 1204 in Egypt and was later buried in Tiberias. His tomb is one of the city's important pilgrimage sites. Yakut, writing in the 1220s, described Tiberias as a small town, long and narrow. He also describes the "hot salt springs, over which they have built Hammams which use no fuel." In 1265 the Crusaders were driven from the city by the Egyptian Mamluks, who ruled Tiberias until the Ottoman conquest in 1516. During the 16th century, Tiberias was a small village. The Italian Rabbi Moses Bassola visited Tiberias during his trip to Palestine in 1522. He said of Tiberias that "it was a big city ... and now it is ruined and desolate". He described the village there, in which he said there were "ten or twelve" Muslim households. The area, according to Bassola, was dangerous "because of the Arabs", and in order to stay there, he had to pay the local governor for his protection. As the Ottoman Empire expanded along the southern Mediterranean coast under Sultan Selim I, the Reyes Católicos (Catholic Monarchs) began establishing Inquisition commissions. Many Conversos (Marranos and Moriscos) and Sephardi Jews fled in fear to the Ottoman provinces, settling at first in Constantinople, Salonika, Sarajevo, Sofia and Anatolia. The Sultan encouraged them to settle in Palestine. In 1558, a Portuguese-born marrano, Doña Gracia, was granted tax collecting rights in Tiberias and its surrounding villages by Suleiman the Magnificent. She envisaged the town becoming a refuge for Jews and obtained a permit to establish Jewish autonomy there. In 1561 her nephew Joseph Nasi, Lord of Tiberias, encouraged Jews to settle in Tiberias and rebuild the city. Securing a firman from the Sultan, he and Joseph ben Adruth rebuilt the city walls and laid the groundwork for a textile (silk) industry, planting mulberry trees and urging craftsmen to move there. Plans were made for Jews to move from the Papal States, but when the Ottomans and the Republic of Venice went to war, the plan was abandoned. At the end of the century (1596), the village of Tiberias had 54 households: 50 families and 4 bachelors. All were Muslims. The main product of the village at that time was wheat, while other products included barley, fruit, fish, goats and bee hives; the total revenue was 3,360 akçe. In 1624, the Sultan recognized Fakhr-al-Din II as Lord of Arabistan (from Aleppo to the borders of Egypt). The 1660 destruction of Tiberias by the Druze resulted in the abandonment of the city by its Jewish community. Unlike Tiberias, the nearby city of Safed recovered from its destruction and was not entirely abandoned, remaining an important Jewish center in Galilee.
In the 1730s, the Arab ruler Daher al-Umar, of the Zaydani clan, fortified the town. Accounts from that time tell of the great admiration people had for Daher, especially for his war against bandits on the roads. Richard Pococke, who visited Tiberias in 1737, witnessed the building of a fort to the north of the city and the strengthening of the old walls, attributing it to a dispute with the governor of Damascus. Under instructions from the Ottoman Porte, Sulayman Pasha al-Azm of Damascus besieged Tiberias in 1742, with the intention of eliminating Daher, but his siege was unsuccessful. In the following year, Sulayman set out to repeat the attempt with even greater reinforcements, but he died en route. Under Daher's patronage, Jewish families were encouraged to settle in Tiberias. He invited Rabbi Chaim Abulafia of Smyrna to rebuild the Jewish community. The synagogue he built still stands today, located in the Court of the Jews. In 1775, Ahmed el-Jazzar "the Butcher" brought peace to the region with an iron fist. In 1780, many Polish Jews settled in the town. In 1834, W. E. Fitzmaurice wrote that the city suffered heavy rains and snow in winter, which caused great damage; except for the synagogue, the lower floors of all houses were flooded with water up to two or three feet. He added that most of the population were Jews from Poland, with only a few Muslims. During the 18th and 19th centuries the city received an influx of rabbis who re-established it as a center for Jewish learning. An essay written by Rabbi Joseph Schwarz in 1850 noted that "Tiberias Jews suffered the least" during an Arab rebellion which took place in 1834. Around 600 people, including nearly 500 Jews, died when the town was devastated by the 1837 Galilee earthquake. An American expedition reported that Tiberias was still in a state of disrepair in 1847/1848. Rabbi Haim Shmuel Hacohen Konorti, born in Spain in 1792, settled in Tiberias at the age of 45 and was a driving force in the restoration of the city. In the 19th century, the Greek Orthodox Patriarchate of Jerusalem purchased lands that were incorporated into its growing network of rural estates. In the 1922 census of Palestine conducted by the British Mandate authorities, Tiberias had a population of 6,950 inhabitants, consisting of 4,427 Jews, 2,096 Muslims, 422 Christians, and five others. Initially the relationship between Arabs and Jews in Tiberias was good, with few incidents occurring in the Nebi Musa riots in 1920 and the Arab riots throughout Palestine in 1929. The first modern spa was built in 1929. The landscape of the modern town was shaped by the great flood of 11 November 1934. Deforestation on the slopes above the town worsened the disaster, as did the city's layout: closely packed houses and buildings, usually sharing walls, built along narrow roads paralleling and closely hugging the shore of the lake. Flood waters carrying mud, stones, and boulders rushed down the slopes and filled the streets and buildings with water so rapidly that many people did not have time to escape; the loss of life and property was great. The city was rebuilt on the slopes, and the British Mandatory government planted the Swiss Forest on the slopes above the town to hold the soil and prevent similar disasters from recurring. A new seawall was constructed, moving the shoreline several yards out from the former shore. In October 1938, Arab militants murdered 19 Jews in Tiberias during the 1936–39 Arab revolt in Palestine.
Between 8 and 9 April 1948, sporadic shooting broke out between the Jewish and Arab neighborhoods of Tiberias. Arab Liberation Army and irregular forces attacked and closed the Rosh Pinnah road, isolating the northern Jewish settlements. On 10 April, the Haganah launched a mortar barrage, killing some Arab residents. The local National Committee refused the offer of the Arab Liberation Army to take over the defense of the city, but a small contingent of outside irregulars moved in. During 10–17 April, the Haganah attacked the city and refused to negotiate a truce, while the British refused to intervene. Newly arrived Arab refugees from Nasir ad-Din told of the civilians there being killed, news which brought panic to the residents of Tiberias. The Arab population of Tiberias (6,000 residents, or 47.5% of the population) was evacuated by the British forces on 18 April 1948. The Jewish population looted the Arab areas and had to be suppressed by force by the Haganah and Jewish police, who killed or injured several looters. On 30 December 1948, when David Ben-Gurion was staying in Tiberias, James Grover McDonald, the United States ambassador to Israel, requested to meet with him. McDonald presented a British ultimatum for Israeli troops to leave the Sinai peninsula, Egyptian territory. Israel rejected the ultimatum, and the meeting made Tiberias briefly famous. In the months after the occupation of the city, a large part of the buildings of the old city of Tiberias were destroyed, for various reasons: problems of hygiene, rickety construction, and the fear that the Arabs would return to the city once it became known that their return was a requirement of Jordan in the negotiations conducted in Rhodes. Finally, the authorities acceded to the initiative of Yosef Nahmani of the Jewish National Fund, who argued that the houses of the Old City should be demolished, despite the opposition of Mayor Shimon Dahan. The destruction began in the summer of 1948 and continued until the first months of 1949. A visit by David Ben-Gurion to the city brought an end to the destruction, after 477 out of 696 houses had been destroyed according to official estimates. What survived the demolition were the remains of the wall and the citadel, several houses on the outskirts of the city, and the two mosques that had operated in the city. The area stood abandoned for decades, until restoration work began in the 1970s. The city of Tiberias has been almost entirely Jewish since 1948. Many Sephardic and Mizrahi Jews settled in the city following the Jewish exodus from Arab countries in the late 1940s and early 1950s. Over time, government housing was built to accommodate much of the new population, as in many other development towns. In 1959, during the Wadi Salib riots, the "Union des Nords-africains led by David Ben Haroush, organised a large-scale procession walking towards the nice suburbs of Haifa creating little damage but a great fear within the population. This small incident was taken as an occasion to express the social malaise of the different Oriental communities in Israel and riots spread quickly to other parts of the country; mostly in towns with a high percentage of the population having North African origins like in Tiberias, in Beer-Sheva, in Migdal-Haemek". Over time, the city came to rely on tourism, becoming a major Galilean center for Christian pilgrims and internal Israeli tourism. The ancient cemetery of Tiberias and its old synagogues also draw religious Jewish pilgrims during religious holidays.
Tiberias has a small port on the shore of the Sea of Galilee, serving both fishing and tourist activities. Since the 1990s, the importance of the port for fishing has gradually decreased with the decline of the lake's water level, due to continuing droughts and increased pumping of fresh water from the lake. It was expected that the lake would regain its original level (almost 6 metres (20 feet) higher than today) once Israeli desalination facilities reached full operational capacity, by 2014. In 2020, the lake rose above the level it had been at in 1990. In 2012, plans were announced for a new ultra-Orthodox neighborhood, Kiryat Sanz, on a slope on the western side of the Kinneret. Demographics According to the Central Bureau of Statistics (CBS), as of August 2023, 49,876 inhabitants lived in Tiberias. According to the CBS, as of December 2019 the city was rated 4 out of 10 on the socio-economic scale. The average monthly salary of an employee in 2019 was 7,508 NIS. Among today's population of Jews, many are Mizrahi and Sephardic. The yearly growth rate of the population is 3.9%. Following Israel's withdrawal from Lebanon in 2000, many ex-South Lebanon Army soldiers and officers who fled from Lebanon settled in Tiberias with their families. In the Ottoman registers of 1525, 1533, 1548, 1553, and 1572, all the residents were Muslims. The registers of 1596 recorded the population as consisting of 50 families and four bachelors, all Muslim. In 1780, there were about 4,000 inhabitants, two thirds being Jews. In 1842, there were about 3,900 inhabitants, around a third of whom were Jews, the rest being Muslims and a few Christians. In 1850, Tiberias contained three synagogues which served the Sephardi community, which consisted of 80 families, and the Ashkenazim, numbering about 100 families. It was reported that the Jewish inhabitants of Tiberias enjoyed more peace and security than those of Safed to the north. In 1863, it was recorded that the Christian and Muslim elements made up three-quarters of the population (2,000 to 4,000). A population list from about 1887 showed that Tiberias had a population of about 3,640: 2,025 Jews, 30 Latins, 215 Catholics, 15 Greek Catholics, and 1,355 Muslims. In 1901, the Jews of Tiberias numbered about 2,000 in a total population of 3,600. By 1912, the population had reached 6,500, including 4,500 Jews, 1,600 Muslims and 400 Christians. In the 1922 census of Palestine conducted by the British Mandate authorities, Tiberias had a population of 6,950 inhabitants, consisting of 4,427 Jews, 2,096 Muslims, 422 Christians, and five others. There were 5,381 Jews, 2,645 Muslims, 565 Christians and ten others in the 1931 census. By 1945, the population had increased to 6,000 Jews, 4,540 Muslims, 760 Christians, and ten others. During the 1948 Arab-Israeli War, Palestinian Arab residents of Tiberias besieged its Jewish quarter. Haganah troops then successfully attacked the Arab section of the city, and British troops evacuated the Arab residents at their request. Some fled in the wake of news of the Deir Yassin massacre. The entire Arab population of the city was removed in 1948 by the British, partly as a result of a Haganah decision. After the war had ended, a large number of Jewish immigrants to Israel settled in Tiberias. Today almost all of the population is Jewish. In 2022, 93.3% of the population was Jewish and 6.7% was counted as other.
Urban renewal and preservation Ancient and medieval Tiberias was destroyed by a series of devastating earthquakes, and much of what was built after the major earthquake of 1837 was destroyed or badly damaged in the great flood of 1934. Houses in the newer parts of town, uphill from the waterfront, survived. In 1949, 606 houses, comprising almost all of the built-up area of the old quarter other than religious buildings, were demolished over the objections of local Jews who owned about half the houses. Wide-scale development began after the Six-Day War, with the construction of a waterfront promenade, open parkland, shopping streets, restaurants and modern hotels. Carefully preserved were several churches, including one with foundations dating from the Crusader period, the city's two Ottoman-era mosques, and several ancient synagogues. The city's old masonry buildings, constructed of local black basalt with white limestone windows and trim, have been designated historic landmarks. Also preserved are parts of the ancient wall, the Ottoman-era citadel, historic hotels, Christian pilgrim hostels, convents and schools. Archaeology A 2,000-year-old Roman theatre was discovered 15 metres (49 feet) under layers of debris and refuse at the foot of Mount Berenice, south of modern Tiberias. It once seated over 7,000 people. In 2004, excavations in Tiberias conducted by the Israel Antiquities Authority uncovered a structure dating to the 3rd century CE that may have been the seat of the Sanhedrin. At the time it was called Beit Hava'ad. In June 2018, an underground Jewish mausoleum was discovered. Archaeologists said that the mausoleum was between 1,900 and 2,000 years old as of 2018. The names of the dead were inscribed on the ossuaries in Greek. In January 2021, the foundations of a mosque dating to the earliest years of Muslim rule were excavated just south of the Sea of Galilee by archaeologists led by Katia Cytryn-Silverman from the Hebrew University of Jerusalem. Built around 670 CE, it is considered to have been the first purpose-built mosque in the city. Geography and climate Tiberias is located on the shore of the Sea of Galilee and the western slopes of the Jordan Rift Valley overlooking the lake, in the elevation range of −200 to 200 metres (−660–660 feet). Tiberias has a hot semi-arid climate (Köppen: BSh) that borders a hot-summer Mediterranean climate (Köppen: Csa), with an annual precipitation of 437.1 mm (17.21 in). Summers in Tiberias average a maximum temperature of 38 °C (100 °F) and a minimum temperature of 25 °C (77 °F) in July and August. The winters are mild, with temperatures ranging from 10 to 18 °C (50–64 °F). Extremes have ranged from 0 °C (32 °F) to 48 °C (118 °F). Tiberias has been severely damaged by earthquakes since antiquity. Earthquakes are known to have occurred in 30, 33, 115, 306, 363, 419, 447, 631–32 (aftershocks continued for a month), 1033, 1182, 1202, 1546, 1759, 1837, 1927 and 1943. The city is located above the Dead Sea Transform and is one of the cities in Israel most at risk from earthquakes (along with Safed, Beit She'an, Kiryat Shmona, and Eilat). Health care In 1885, a Scottish doctor and minister, David Watt Torrance, opened a mission hospital in Tiberias that accepted patients of all races and religions. In 1894, it moved to larger premises at Beit abu Shamnel abu Hannah. David Watt Torrance died in Tiberias in 1923. The same year his son, Dr. Herbert Watt Torrance, was appointed head of the hospital.
In 1949, following the establishment of the State of Israel, it became a maternity hospital supervised by the Israeli Department of Health. After its closure in 1959, the building became a guesthouse until 1999, when it was renovated and reopened as the Scots Hotel. Poria hospital is located near the Upper Tiberias neighborhood, and operates a hospitalization control center in the city itself. Culture Tiberias, owing to its picturesque setting and its mix of water and mountains, has attracted numerous artists throughout the years. The painter Shimshon Holzman made several watercolours of the city, which he compiled into an exhibition, and set up a studio in Tiberias in 1950. In 2010 a sculpture museum was opened in Tiberias. The city's first football club, Maccabi Tiberias, was established in 1925 but folded in the 1990s after financial difficulties. Hapoel Tiberias represented the city in the top division of football for several seasons in the 1960s and 1980s, but eventually dropped into the regional leagues and folded due to financial difficulties. Following Hapoel's demise, a new club, Ironi Tiberias, was established, which currently plays in Liga Leumit. Six Nations Championship and Heineken Cup winner Jamie Heaslip was born in Tiberias. The Tiberias Marathon is an annual road race held along the Sea of Galilee in Israel, with a field in recent years of approximately 1,000 competitors. The course follows an out-and-back format around the southern tip of the sea, and was run concurrently with a 10k race along an abbreviated version of the same route; in 2010 the 10k race was moved to the afternoon before the marathon. At approximately 200 metres (660 feet) below sea level, this is the lowest marathon course in the world. Twin towns – sister cities Tiberias is twinned with: Notable people Prominent people predating the State of Israel, listed by year of birth: Prominent people in the State of Israel or born/active there, listed alphabetically:
========================================
[SOURCE: https://en.wikipedia.org/wiki/Claude_Reignier_Conder] | [TOKENS: 419]
Claude Reignier Conder Claude Reignier Conder (29 December 1848 – 16 February 1910) was a British Army officer, explorer and antiquarian. He was a great-great-grandson of Louis-François Roubiliac and grandson of the editor and author Josiah Conder. Life Conder was educated at University College London and the Royal Military Academy, Woolwich. He became a lieutenant in the Corps of Royal Engineers in 1870. He carried out survey work in Palestine in 1872–1874, latterly in conjunction with Lt Kitchener, later Lord Kitchener, whom he had met at school, and was seconded to the Palestine Exploration Fund from 1875 to 1878 and again in 1881 and 1882, when he was promoted to captain. He retired with the rank of colonel in 1904. Conder joined the expedition to Egypt in 1882, under Sir Garnet Wolseley, to suppress the rebellion of Urabi Pasha. He was appointed a deputy assistant adjutant and quartermaster-general on the staff of the intelligence department. In Egypt his perfect knowledge of Arabic and of Eastern peoples proved most useful. He was present at the action of Kassassin, the Battle of Tel el-Kebir, and the advance to Cairo, but then, seized with typhoid fever, he was invalided home. For his services he received the war medal with clasp for Tel el-Kebir, the Khedive's bronze star and the fourth class of the Order of the Medjidie. While surveying the area of Safed in July 1875, Conder and his party were attacked by local residents, and Conder sustained a serious head injury which left him bedridden for a while and unable to return to Palestine. The survey of Palestine recommenced only in late February 1877, without Conder.
========================================